rjbs forgot what he was saying

I will make friends by programming.

by rjbs, created 2017-01-15 22:25

Every once in a while I randomly think of some friend I haven't talked to in a while and I drop them an email. Half the time (probably more), I never hear back, but sometimes I do, and that's pretty great. This week, I read an article about Eagle Scouts and it made me realize I hadn't talked to my high school friend Bill Campbell for a while, so I dropped him an email and he wrote right back, and I was happy to have sent the email.

Today, I decided it was foolish to wait for random thoughts to suggest I should write to people, so I went into my macOS Contacts application and made a "Correspondents" group with all the people whom it might be fun to email out of the blue once in a while.

Next, I wrote a program to pick somebody for me to email. Right now, it's an incredibly stupid program, but it does the job. Later, I can make it smarter. I figure I'll run it once every few days and see how that goes.

I wrote the program in JavaScript. It's the sort of thing you used to have to write in AppleScript (or Objective-C), but JavaScript now works well for scripting OS X applications, which is pretty great. This was my first time writing any JavaScript for OSA scripting, and I'm definitely looking forward to writing more. Probably my next step will be to rewrite some of the things I once wrote in Perl, using Mac::Glue, which stopped working years ago when Carbon was retired.

Here's the JavaScript program in its entirety:

  // Get the running (or launched) Contacts application.
  var Contacts = Application("Contacts");

  // Pick a random member of the "Correspondents" group.
  var people = Contacts.groups.byName("Correspondents").people;
  var target = people[ Math.floor( Math.random() * people.length ) ];

  var first = target.firstName.get();
  var last  = target.lastName.get();

  // The script's result: the chosen person's name.
  first + " " + last;
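
Most of the script above is JXA glue; the actual selection is just a uniform random pick from a list. Here's that step in plain JavaScript, runnable anywhere (the hard-coded name list is a hypothetical stand-in for the Contacts group):

```javascript
// Pick one entry uniformly at random, as the JXA script does
// with the members of the "Correspondents" group.
function pickCorrespondent(people) {
  return people[Math.floor(Math.random() * people.length)];
}

// Hypothetical stand-in for the Contacts group.
const correspondents = ["Bill Campbell", "Ada", "Grace"];
console.log(pickCorrespondent(correspondents));
```

On a Mac, the real JXA version runs under `osascript -l JavaScript`, which prints the script's final expression.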

what I read in 2016

by rjbs, created 2017-01-03 10:46
tagged with: @markup:md books journal

According to Goodreads, which should be accurate, these are the books I read in 2016. I meant to read more, but I didn't.

The Pleasure of the Text

I'd been meaning to read this since college, as one of my favorite professors spoke highly of Barthes. I found the book unpleasantly difficult and basically uninteresting. I like to imagine that I would have enjoyed the book much more in the original French, as the language in translation felt pretentious. That said, I think the ceiling on my enjoyment was going to be pretty low. I'll stick with Derrida and Debord.

Eclipse Phase: After the Fall

Eclipse Phase is a transhumanist RPG with a lot of interesting ideas. This is a collection of stories set in that game's universe. I did not enjoy it. There were one or two good bits, but mostly it was just not good.

Puttering About in a Small Land

This is one of Philip K. Dick's realistic novels. He wrote a few of these before settling into all sci-fi all the time. I've found them to be surprisingly good. They're mostly just slice of life stories about slightly unhappy people in mid-20th century California, but I have enjoyed them, as I did this one.

Frost: Poems

I was still making progress on my "one book of poetry each month" project! Frost was spectacular! Much better than I had anticipated. Reading him in middle and high school was too early. I couldn't yet appreciate the subtext of his works. I still think back on these poems now.

Ancillary Mercy

I finished the Ancillary Justice trilogy, and it was good. I seem to recall thinking that Mercy wasn't the best of the three books, but the whole trilogy was very good, and I'm glad to have read it.

The Girl with All The Gifts

Gloria recommended this to me a number of times, and I finally took her advice, and it was good advice. This was a good post-apocalypse story with an interesting premise. I think I was happy with the ending. They made a movie of it, but I haven't seen it yet.

Jane Eyre

I'm going to try to keep reading some classic 19th century novels over the next few years. I should've read more last year, but instead read only Jane Eyre. I feel I could've done a lot worse. It had problems (that ending!), but I really enjoyed it. It was charming and funny and well-written. Its position in the canon is well-earned. I look forward to reading Wide Sargasso Sea this year, too.


I felt like this book played on some of the same ideas as Ready Player One, but was a substantially better book. I found the character and premise more interesting than those in Ready Player One, which seemed to be built mostly on evoking nostalgia.

Programming Pearls

This is a book that other computer books recommended to me often enough that I asked for a copy, received it, and put it on my shelf for years. I mostly enjoyed it, although sometimes I found it a bit boring. The interesting bits were very interesting, though. I think my recommendation here is "read it, but skip the parts you don't find interesting."

The Fine Art of Mixing Drinks

I'd been nursing this book for a few years, and finally decided to just finish it. It's a good book to have a run through, although I wouldn't say it revolutionized my drink making.

Uncertainty in Games

This short book discusses the different kinds of uncertainty that can be introduced to games to make them more interesting. I enjoyed the overview, and it helped me think about what makes games that I like interesting (or not). It also didn't fall into the sort of stodgy prose that this sort of theoretical work often does.


This was a tolerable sci-fi story rendered intolerable by the writing of its characters. The romance and love scenes made me groan and roll my eyes. Skip it.

Dr. Adder

This book got a strong recommendation from Philip K. Dick, who said it would've defined cyberpunk if it had been published when written, instead of many years later. It definitely felt like a bridge between Dick and cyberpunk, but it was a big mess and spent a lot of its time reminding the reader how very transgressive it was. I'm glad I finally read it, but you probably shouldn't bother.

The Library at Mount Char

This might be the best recent book I read this year. It's about a group of weird people raised by a dangerous madman. The madman is missing, and they seem to be looking for him. It's pretty dark, but also funny in places. It was a good read.

Saturn Run

The US and China launch missions to Saturn (and back) after seeing evidence of an alien space ship there. The book had a lot going for it, idea-wise, but I found its characters uninteresting and its plot predictable.

Shadow of the Torturer

This is the first book of the Book of the New Sun books. I think they're amazing, especially in that they greatly reward repeat reading. I got a lot more out of them on this read than I did on my last, and I will surely read them again in a few years. Next time, I may take notes. This year, I plan to read Peace, another novel by Wolfe. This one also, I am told, rewards repeat reading, but it's only about 300 pages, which is a relief.

Starting Forth

I had read most of this book a few years ago, but never finished the exercises in the later chapters. This year I forced myself to do so, which led to re-reading a few chapters to get back my Forth legs. I'm very glad I did this, because it renewed my appreciation for Forth and got me to write a simple but instructive program in Forth.

Forth is great and more people should learn it. (Probably almost nobody should be using it for much real work, though.)

Claw of the Conciliator

This is the second part of the Book of the New Sun.

Thinking Forth

Stack Computers: The New Wave

Sword of the Lictor

This is the third part of the Book of the New Sun.

Citadel of the Autarch

This is the fourth part of the Book of the New Sun.

The Urth of the New Sun

This is the fifth and final part of the Book of the New Sun.

The Sonnets (Berrigan)

The most compelling part of this collection is the introduction, which makes big claims. By the end of the book, I felt like maybe those claims hadn't been malarky, after all. The sonnets and general structure of the book became more interesting as it went on, and I began to see more of the simultaneity of the work's poems. I almost felt like starting over with that in mind. Almost.

Who Censored Roger Rabbit

This is the book on which Who Framed Roger Rabbit was based. The film is superior in every way. I am irritated that I spent time on this book.


This is a short story about how libertarianism isn't all that it's cracked up to be. It was good, but maybe I would've liked it better if Vance, rather than Heinlein, had written it.

Learn Vimscript the Hard Way

This is a good book on Vim. It didn't change my life, but I learned things.

The Skin

I'd had this book on my list for years after a favorable review when the book was first published unexpurgated in English. It's a semi-real memoir of Curzio Malaparte's experience of the final months of WWII in Italy. There were parts where I felt certain that Joseph Heller must have read it before writing Catch-22, most especially in the rantings of a man claiming that Italy had won by losing the war, just like the dirty old man in Catch-22's brothel.

I'm glad I read the book. It was an interesting read and definitely had its moments, but I can't say that I'd strongly recommend it to anybody else. It was disjointed and sometimes difficult to enjoy. Still, as I say, I'm glad I read it.

Also, I should admit that one of the things that bothered me is the amount of dialog in French. It made me feel poorly educated, which I am.

Something Happened

This is Joseph Heller's second book, published 13 years after Catch-22. From Wikipedia:

> Something Happened has frequently been criticized as overlong, rambling, and deeply unhappy. These sentiments are echoed in a review of the novel by Kurt Vonnegut Jr., but are balanced with praise for the novel's prose and the meticulous patience Heller took in the creation of the novel.

I agree with those remarks, except for the idea that the prose and patience balanced out the book's painful length. (It's only about 500 pages, but they're long pages.) There was a lot that I really liked about this book, but eventually I couldn't stay engaged and skimmed my way to the irritating conclusion.

Seven Databases in Seven Weeks

I found this about as good as the Seven Languages books. In other words: it was ... okay. It was a tolerable crash course, but I wasn't interested enough to do the exercises. Maybe doing them would've helped, but the book itself didn't get me interested enough to bother. Also, I have generally felt, in these books, that none of the authors has a really interesting narrative or voice.

Still, I understand Neo4j a lot better, now.

Luminous

This is the other short story collection by Greg Egan. In general, I thought every story in it was a failure. Some had promise, but most were lousy, especially compared to his other (better, but still flawed) story collection, Axiomatic. I did sort of like Reasons to Be Cheerful, but not enough to make the book worth reading. Maybe not even enough to make the story worth reading.

Wyst: Alastor 1716

After Luminous, I wanted to read something I was guaranteed to enjoy. Clearly, I thought, I should read some Jack Vance. I asked Mark Dominus for a recommendation, and this was the shorter of the two he recommended. I enjoyed the heck out of it, because Vance is a delight.

Discovering Scarfolk

This is a book from the Scarfolk blog, which presents surreal artifacts and articles from a fictional 1970s English town, Scarfolk. It was a fun and quick read, but I find the blog more fun.

The Informers

While on business in Melbourne, I got thinking about Bret Easton Ellis. Years ago I decided that I needed to space out my reading of his books, so that I wouldn't run out of books too quickly. He's one of my favorite authors, although I'm not sure I could say why. It has something to do with the very, very precise kind of bleakness he presents.

The Informers is (I think) his fourth book, and the fifth that I've read. It's a short story collection, and each story is very, very loosely connected to some of the other stories in the book. This mirrors his usual practice of tying his books together with very thin threads. I found that a story collection was a great format for him, because it was able to further spread out the pointlessness depicted. It didn't need much motion at all, because the break between stories was a substitute for any actual rising or falling action.

I was very pleased with my decision to read it.

The Speechwriter

This is a political memoir by one of the speechwriters for [Mark Sanford](https://en.wikipedia.org/wiki/Mark_Sanford), former governor of South Carolina. I had read that it provided some great insights into American politics, but it didn't. On the other hand, it was a light, enjoyable read. It was well written and funny when it wanted to be.

The Plagiarist

In a desperate attempt to get one more book read before 2016 ended, I decided to read this sixty-four page novella by Hugh Howey. It's a sci-fi story about a man who visits computer-generated worlds so that he can steal their original works of literature. It was mediocre.

Horror Movie Month 2016

by rjbs, created 2016-11-02 23:21
last modified 2016-11-02 23:22
tagged with: @markup:md journal movies

Another year, another thirty-one days of horror movies! I think our selections this year had fewer losers than some past years, but probably also fewer stand-out winners.

What We Watched

October 1: The Thing (1982)

A classic! I forgot how good the creature effects were, and how effective the moments of suspense. Also, as with all of Carpenter's original scores: great music!

We watched this one with Martha, who said it was much less boring than the 1951 version, which we previously tried to watch and abandoned, in preparation for this one. On the other hand, she said it was not scary.

October 2: Don't Breathe

We saw this one in the theatre!

It had a lot going for it, but in the end, we thought it was only okay, at best. I think I might have given it a "pretty good, if flawed," if it hadn't basically become a super-gross rape movie.

No, thanks. We want fun horror movies.

October 3: Paranormal Activity: The Ghost Dimension

a.k.a. Paranormal Activity 6

It was the best Paranormal Activity movie in a long time. This does not mean it was very good. I liked a few things that it did, especially in how it tied itself back to the original film. Still, though, these movies are just not that great. I think it was about as good a capstone as we were going to get.

October 4: Silent Scream

We've spent some time working through a BuzzFeed list of horror movies. Some of the movies we'd already seen, and some we saw over the last two years. Now we're down to movies that I've had a hard time finding. Silent Scream is one of those.

Short version: it wasn't worth it. It had a couple funny bits, but mostly it was one of those 1980-ish films where they were trying to figure out how to make a slasher the right way. This one didn't hit the mark.

October 5: The Gate

Some kids stumble across a hole in the back yard. Their parents go away. The hole is full of tiny demons. They spend the movie fighting the demons before finally banishing them. It's from 1987, and you can tell. It was time to market horror-style movies to young audiences, and that's what this is. It's not very good.

Instead, watch Joe Dante's The Hole.

October 6: He Never Died

Don't read the Wikipedia article! Just go see the movie!

This was probably my favorite new film of the month. It wasn't perfect, and as the movie went on it got less interesting to me, but Henry Rollins was just perfect. He plays a weird loner who lives in the big city and tries to avoid doing anything interesting. He eats eggplant parms every day and plays bingo a few times a week. There is something seriously weird going on with this boring guy.

I enjoyed it.

October 7: Extraordinary Tales

It's an anthology of animated Edgar Allan Poe stories. I was entirely unimpressed. Read them instead.

October 8: We Are Still Here

Middle-aged couple moves into a haunted house. Their neighbors visit but are sort of creepy weirdos. Friends come to visit and things get worse. On its surface, this movie didn't look very interesting, but I enjoyed it. It had good pacing. It was creepy. I wish the story held together a bit better, but it was fun. I would watch a sequel.

October 9: The Others

This was our second horror movie with Martha this year!

Gloria and I had seen this before, and I remembered thinking it was okay. On rewatching, I found it more interesting. It was a pretty solid ghost story.

Martha's verdict: good movie, not scary.

October 10: Ava's Possessions

The movie starts when a young woman has a possessing demon exorcised from her. She's found guilty of crimes she committed while possessed, and one option she's given is to go into a group therapy program for possession victims. This is a good setup, and they do not entirely squander it. I think I would've liked the movie better if there had been less horror action and more frank discussion of the problems of being possessed.

Still, I enjoyed it.

October 11: American Horror Story: Roanoke

Promising start. We liked the format as a ghost stories TV show.

October 12: Green Room

So, it was okay. A lefty punk band gets booked to play a white supremacist skinhead club. Things go sour.

We were excited because it had Patrick Stewart. He was wasted. The rest of the cast was good, though.

In the end, it was just an "everybody tries not to get killed" movie with a touch of torture porn. I was sure it was going to be a werewolf movie, and I was sorely disappointed when it wasn't.

October 13: American Horror Story: Roanoke

Uh oh. The usual American Horror Story thing is starting: too many plot lines, too many characters. What's going on here? Can this possibly remain coherent?

October 14: Southbound

An anthology! I love a good anthology, but so many horror anthologies are just crap. This one was not! It wasn't perfect, but it was good. There were five stories, each one linking to the next. I liked every story (but The Accident may have been my favorite), and I thought they were just connected enough to make it fun.

October 15: Krampus

It's a horror-comedy about Christmas. Krampus, Santa's evil buddy, comes to punish the grinchy. It was okay. It was not great, nor was it terrible.

October 16: The Fog (1980)

Another film that we'd seen before, but now watched with Martha! I like a lot about The Fog, and there's also a lot that I don't like. Maybe my big problem is that I find ghost pirates altogether too corny as villains.

Martha's verdict: she liked it, even though it wasn't scary.

October 16: Big Bad Wolves

I was a little worried when I realized this one would have subtitles, but I didn't mind. Also, this movie sounded very good. That is: the spoken Hebrew sounded really good in the mouths of these characters, even if I didn't understand a word of it.

The movie was well made, and I liked quite a few things about it, but its effective black comedy was undermined by the fact that it was about someone who raped and murdered children.

October 17: American Horror Story

Ugh, we give up. Too much stuff going on. Even if some of it was good, it was too much of a mess.

My proposal was that they try to do this series differently. Instead of acting like it's a whole season of a ghost stories TV series, they should've run a new ghost story each episode, slowly letting the viewer realize that they were somehow connected. Maybe by the end, we break the fourth wall and realize that the horror is now hunting the producers of the show.

Except the producers of American Horror Story would have to add eighteen subplots to the story.

October 18: Glitch

This ended up not seeming very horror-y. It's a new-to-us Australian TV series about a very small number of the dead rising from their graves. It looked pretty good, and we'll definitely watch more of it.

October 19: the third presidential debate

Okay, we skipped any traditional horror viewing to watch the debate between Clinton and Trump. I was left shaken.

October 20: Viral

Viral is an outbreak movie in which a parasite begins to spread rapidly, turning people into murder monsters. The film focuses on the experiences of two sisters trapped in a quarantined LA suburb during the outbreak while their parents are away. It was okay, but I felt like it could've been a lot better with some more work on the script.

October 21: Night of the Living Deb

Weird-o awkward protagonist has a one night stand with the guy she's been crushing on for ages. When they wake up, zombies have taken over their town. As they try to escape they discover what's really happening to their city… and their hearts.

It was okay. It really needed better writing. It occasionally felt like an SNL sketch.

October 22: Detention

We watched this one a few years ago, too. I really, really enjoyed this movie. I liked that it was weird, and unlike anything else, but not pretentious or self-important. It's just fun.

October 23: The Blob (1988)

Another movie watched with Martha! I hadn't seen this movie for maybe twenty years or more. When we watched it, I became sure of something I had forgotten: there was a novelization of this movie, and I read it. I can't be sure, but I feel pretty confident that this is true. I would've been about ten, so I probably thought it was awesome.

Anyway, the movie was cheesy, but not terrible. It was definitely much better than the 1958 version, which I considered watching with Martha a year or so ago. That was tiresome enough that I gave up during my previewing.

This one had some good bits. I especially liked the line cook being pulled bodily down the sink drain, making it bulge like a snake eating a rat.

Martha's verdict: fun, but not scary.

October 23: Insidious: Chapter 3

I may have liked this best of all the Insidious films. A teenage girl wants to contact the spirit of her dead mother, but instead gets the attention of malicious spirits living in her building. A friendly medium is called in to help.

There wasn't really anything in particular that I liked about this movie, I just liked it. I liked the cast, and I was pleased that the movie was neither very weird nor overly hampered by its formula.

All that said, it wasn't great. It was okay.

October 24: The Conjuring

This movie had a lot of problems. I wanted to yell at the TV, "Why are you doing this stupid thing that is obviously going to get someone killed?" Despite this, it was a decent horror movie. The creepy parts were creepy, the relaxing middle parts were relaxing, and it wasn't just jump scares.

Gloria and I are both pretty tired of possessions and haunted houses, though. The ideas aren't spent, but they've both been covered the same way in many, many films, especially in the last ten years. We need to see new ideas.

October 25: The Strain

We watched the first episode of this TV series about vampires invading New York City. (This is my understanding about what the series will be about. Even as I write this, we've only made it to episode three. I'm not sure what's really going to happen.)

I found the first episode sort of interesting, but also a mess. I didn't care enough to keep watching, but I felt like if I watched another episode, I might realize that then I'd want to keep watching. This turned out to be true.

I read somewhere that the producers of The Strain kept wanting to push the limits to see what they could get away with. I can see that, in the show.

October 26: The Purge: Election Year

Each subsequent Purge movie gives us more information about what life is like in the world of The Purge. This is a mixed blessing. On one hand, we want to know how the world has come to this, and what is really going on in America. On the other hand, the movie's answer is stupid and doesn't amount to much more than "bad stuff happened I guess."

I would like to see a short run TV series about The Purge, giving us better stories of how it started, what life is like when the Purge isn't going on, and how cleanup and recovery works the next day.

Just watching people try to make it across town while the laws are on hold gets old, and I could just go watch Escape from New York instead.

October 27: Sinister

Dude moves into a murder house to write about it. He discovers home movies of all kinds of awful murders. He realizes that there is a supernatural force that has been killing families for years. He tries to escape, but he can't, because there is a TWIST!

I liked a lot of things in this movie. It had plenty of good creepy bits. We really liked James Ransone's character and his scenes with Ethan Hawke. In the end, the explanation was not great. Vincent D'Onofrio's character makes things much more confusing than they needed to be.

I was pretty willing to watch more, though. So we did.

October 28: Sinister 2

We were sold on this when we found out that James Ransone's character from the first film would have a major role. Unfortunately, I found this one a lot less interesting. We mostly watched the lives of the two kids, and it was more unpleasant than interesting. The first movie did a good job of making it clear that creepy things happened to kids, but we didn't have to watch. In the second movie, we had to watch a lot of unpleasant kid scenes. Meh.

October 29: Ash vs. Evil Dead

I didn't know this series existed until we heard about it on NPR. We've only watched one episode so far, but I enjoyed it. It was very reminiscent of the original films. I know this shouldn't be surprising, given the people involved, but I was worried.

Gloria said, "It's hard to imagine that this kind of thing will work for two whole seasons." I agree, but I look forward to finding out what happens.

October 30: Poltergeist

Martha had been begging to watch this movie for over a year. Why oh why would we not let her watch it? She would not be scared. So, after a month of "this movie isn't scary" we said she could watch Poltergeist with us.

We settled in to watch the movie. Children were eaten by trees, vanished into televisions, attacked by clowns. Parents floated amid rotting corpses in the pool. A guy peeled his own face off.

Martha's verdict: Good movie, not scary.

I guess I'm just glad she liked it.

October 31: The Witch

I really liked It Follows, and thought it was a good movie in that every part of it was focused on creating a particular mood, and it worked. I heard people compare The Witch to It Follows just for this reason, so I was keen to see it.

Gloria and I both found it boring and uninteresting.

We didn't like the characters. We didn't find the evil things creepy. We didn't think the conflicts were interesting. I was especially irritated that the parents were so religious that it made them stupid. I tweeted this, and of course got replies in support of the idea that all religious people are stupid. Putting this tiresome idea aside: it doesn't make good viewing.


Is this the last 31 movie Horror Movie Month? Maybe.

We've found fewer great new horror movies the last few years, and often the best ones we find, we watch during the year, meaning that October is a bit of a slog. This year was much better, but I'm not sure we'll have that luck in 2017. Maybe we'll switch to mostly watching horror movies with Martha in October instead.

We've got a good eleven months to decide, though.

Horror Movie Month 2015 (body)

by rjbs, created 2016-10-29 19:57
last modified 2016-11-02 21:15
tagged with: @markup:md journal movies

So, it turns out I never posted a summary of our Horror Movie Month for 2015! I tried to recreate our viewing list by looking up my tweets, but there are a bunch of days with no movie tweet. What happened? I'm not sure. Anyway, here's a very late summary, missing days, and with a year's clouding of my memory. How accurate will it be? Who knows!

If I was more dedicated, I'd go find all my tweets from these days to reconstruct my thinking at the time. But I'm not.

The Whole List

October 1: Incident On and Off a Mountain Road

This was the first of several Masters of Horror films that we watched. As with most of them, I thought it had an okay idea that didn't really work out. We were annoyed that the movie seemed to be setting up one kind of scenario, but then it was something else entirely.

October 2: Cam2Cam

There have been a few horror movies told through video chat. This was one of the better ones, but it still wasn't great. I guess I don't regret it.

October 3: All the Boys Love Mandy Lane

All the Boys Love Mandy Lane, too, wasn't great, but I liked the ways it defied or subverted several genre norms.

October 4: Scream


It was still very good. I think this is probably one of my (and our?) favorite slasher movies, because it's scary, funny, and smart. Like many of the best slashers, though, it only works because you know the genre tropes. That's okay, though, because we all do!

October 5: Afflicted

Afflicted was a good take on the "something bad happens on a road trip" movie, and much better than many similar movies I've seen. Found footage, though… even though it often makes sense in the movies being made, it's a technique that needs a long rest.

October 6: 13 Sins

I really liked 13 Sins. It wasn't great, but I enjoyed its little twists, and it worked as a (very) dark comedy.

October 7: American Horror Story

We hated the 2015 season of American Horror Story and gave up after a couple episodes. The story was a mess, there was too much going on, and we didn't care about any of it. Also, we found a number of the actors' performances to be really dull.

October 7, later: Late Phases

I liked it. It wasn't the best thing ever, but I liked the characters, I liked the story, and I liked the way its tone varied into places we don't often see in movies like these. There's a great scene in which the protagonist is riding to town with a busload of (other) senior citizens and nothing much happens.

October 9: Open Grave

I don't remember it well. My recollection is that it had a bunch of really tired tropes, but did okay despite them… but it wasn't great. "Everyone is locked in a room and has amnesia" is something we've probably had enough of.

October 10: Scream 2

It was good!

October 11: Housebound

Gloria had already seen this and consented to watch it again. Why? Because it was good! Gloria likes horror comedies, and so do I, but I keep thinking we should watch serious ones, too, and I'm almost always wrong. In Housebound, a young woman in New Zealand is sentenced to house arrest and slowly begins to realize that the house is haunted. Hollywood would probably make this movie really gritty and creepy and shot in lots of sepia tones. Housebound was funny and surprising and weird. It took a couple twists that surprised me and made me laugh. Endorsed!

October 12: Curse of Chucky

I'd put it in the same bin as the earlier Child's Play movies, but better than most of them. That said, it was only okay, at best.

October 13: Hellraiser Ⅲ

Terrible. Awful. Why did they keep making these? I do mean to finish watching the series, but only because I am an idiot. The only thing I remember liking was "The DJ," a cenobite who throws compact discs with lethal force.

Really, it was just terrible.

October 15: Pick Me Up!

This was another Masters of Horror that didn't deliver on its potential. It was okay, and I'm glad I saw it, but I'd like to see it done over again. It's about two competing serial killers, which was a fun concept.

I learned about Masters of Horror existing because of this movie. I was looking for more films by Larry Cohen, whose work I've largely enjoyed.

October 16: Cheap Thrills

This was fun. Like 13 Sins, it's about some guy who gets told to do increasingly weird things to make money. It's also got a darkly comedic streak. I enjoyed it!

October 18: Mischief Night

Blind woman terrorized by masked killer. I barely remember it at all. We've seen enough of these movies that I do remember, so this one is probably best ignored.

October 19: The Houses October Built

Road trip! A bunch of friends are traveling around and looking for the scariest haunted house experiences around the country. There's this rumor that a super-secret one exists that is way scarier than you've ever seen before.

Sounds like a great concept, I thought! I found it disappointing.

October 20: Dreams in the Witch House (Masters of Horror)

This was one of the better Masters of Horror that we watched, but it was still mostly just okay. I enjoyed that it was a pretty faithful adaptation of Lovecraft, since so many adaptations change too much.

October 21: Razorback

This is a 1984 horror movie about an enormous bloodthirsty razorback boar that kills people. It's basically Jaws, but in the outback. This movie was not good, but it was pretty weird. It seemed like, "If this is even sort of what 1984 rural Australia was like, then Mad Max seems like a much more plausible vision of the future than I thought."

October 22: Jenifer

It's a Masters of Horror film. Like many of the others, it was interesting, but not really great. That said, it's probably my favorite piece of horror work by Dario Argento, whose films I usually find sort of overwrought. It was nice and creepy.

October 23: Case 39

I had entirely forgotten this film until I re-read the Wikipedia page just now. I can't say whether I liked it or not. I think, if I recall correctly, that it had a few good moments, but was otherwise mediocre.

October 25: Stonehearst Asylum

This movie had a good twist, but I think it maybe gave it up too soon… but it might have been pretty hard to keep it hidden for much longer. It had a lot of good talent (Ben Kingsley!), but it could've done more to be creepy and not just weird.

October 26: Nightbreed

I heard so many good things about this film, but I was really hesitant, because I'm not much of a fan of Clive Barker. It was so-so. The Nightbreed themselves were weird, but not very compelling. I would've rather re-watched Basket Case 2.

October 27: Cigarette Burns

This was probably my favorite of the Masters of Horror movies that we watched. It's very much in the spirit of H.P. Lovecraft, at least until the strange (but good) ending. It reminded me in many ways of Carpenter's earlier In the Mouth of Madness, which I think was (as I recall) a better film, despite a number of scenes in Cigarette Burns that were more nuanced and interesting than anything in Madness.

October 28: Vlog

I barely remember this. As I recall, it was much creepier than I expected, but had a bunch of other problems. It was one of the many (many? well, enough) webcam movies we watched in 2015.

October 29: Homecoming (Masters of Horror)

This was a dark comedy in which dead American soldiers rise from their graves as zombies, with just one desire: the vote. They plan to vote for anyone who will end the war. The film focuses on the campaign staff of the sitting president, who supports the war, and has to spin this the right way. The movie had a lot of problems, but it was good enough in other ways to overcome them.

October 30: The Fury

So, this was something like Carrie, or Firestarter, or Scanners. There's a secret government plan to weaponize psychics, and of course things go wrong. It was mediocre.

October 31: A Girl Walks Home Alone at Night

This was an Iranian film — I think. It was filmed in Persian, anyway, and directed by an Iranian woman. I remember it only faintly. I recall liking what they did with a cat.

October 31: The Stuff

This was my then-eight-year-old daughter's first admission to Horror Movie Month. It's a film by Larry Cohen, whose films I really like. In it, a weird white goo is found bubbling up from the frozen ground by someone who seems to be an industrial site night watchman. He looks at it quizzically and then, of course, tastes it. It's delicious!

Months later, The Stuff is the number one dessert in the country. People love it. They eat it for all three meals, plus snacks. It's a huge sensation, even if the ingredients just say "natural ingredients."

A young boy runs away from his family because they've lost their minds for The Stuff. He teams up with Michael Moriarty (who appears often in Cohen's films) to find the secret of The Stuff and stop it from consuming all the consumers in the world!

After the movie, we treated ourselves to some of that wonderful Stuff!

The Stuff

sending email with TLS (in Perl) (body)

by rjbs, created 2016-10-22 22:55

Every once in a while I hear or see somebody using one of the two obsolete secure SMTP transports for Email::Sender, and I wanted to make one more attempt to get people to switch, or to get them to tell me why switching won't work.

When you send mail via SMTP, and need to use SMTP AUTH to authenticate, you want to use a secure channel. There are two ways you're likely to do that. You might connect with TLS, conducting the entire SMTP transaction over a secure connection. Alternatively, you might connect in the clear and then issue a STARTTLS command to begin secure communication. For a long time, Perl's default SMTP library, Net::SMTP, did not support either of these, and it was sort of a pain to use them.

Email::Sender is probably the best library for sending mail in Perl, and it's shipped with an SMTP transport that uses Net::SMTP. That meant that if you wanted to use TLS or STARTTLS, you needed to use another transport. These existed as Email::Sender::Transport::SMTPS and Email::Sender::Transport::SMTP::TLS. They worked, but you needed to know that they existed, and they might rely on libraries (like Net::SMTPS) not quite as widely tested as Net::SMTP.

About two years ago, Net::SMTP got native support for TLS and STARTTLS. About six months ago, the stock Email::Sender SMTP transport was upgraded to use it. Now you can just write:

my $xport = Email::Sender::Transport::SMTP->new({
  host => 'smtp.pobox.com',
  ssl  => 'starttls', # or 'ssl'
  sasl_username => 'devnull@example.com',
  sasl_password => 'aij2$j3!aa(',
});

...and not think about installing anything else. This is what I suggest you do.

I'm learning Rust! (body)

by rjbs, created 2016-10-15 22:45

I've been meaning to learn Rust for a long time. I read the book a while ago, and did some trivial exercises, but I didn't write any real programs. This is a pretty common problem for me: I learn the basics of a language, but don't put it to any real use. Writing my stupid 24 game solver in Forth definitely helped me think about writing real Forth programs, even if it was just a goof.

Now I'm working on implementing the Software Tools programs in Rust. These are simple programs that solve real world problems, or at least approximations of real world problems. I've written programs to copy files, expand and collapse tabs, count words, and compress files. So far, all my programs are pretty obviously mediocre, even to me, but I'm having fun and definitely learning a lot. At first, I thought I'd be working my way through the book program by program, but now I realize that I'm going to be continually going back to earlier work to improve it with the things I'm learning as I go.

For example, I started off by buffering all my I/O manually, which worked, but made everything I did a bit gross to look at. Later, I found that you can wrap a thing that reads from a file (or other data source) in something that buffers it but then provides the same interface. I went back and added that to my old programs, deleting a bunch of code.
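The same wrap-a-raw-reader idea exists in Python's io module, which makes for a compact illustration of the concept (this is my sketch, not the Rust code; in Rust the analogous type is std::io::BufReader):

```python
import io
import os
import tempfile

# Create a scratch file to read back (the path is temporary, just for the demo).
fd, path = tempfile.mkstemp()
os.write(fd, b"hello\nworld\n")
os.close(fd)

raw = io.FileIO(path)               # unbuffered: every read() hits the OS
buffered = io.BufferedReader(raw)   # same read() interface, but buffered

data = buffered.read()              # one call drains the file through the buffer
buffered.close()
os.remove(path)
```

The point is that the buffered wrapper exposes the same interface as the raw reader, so the calling code doesn't change.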

Soon, I know I'm going to be going back to add better command line argument handling. I'm pretty sure my error handling is all garbage, too.

Still, the general concept has been a great success: I'm writing programs that actually do stuff, and they have fun edge cases, and it's just a lot less tedious than exercises in a textbook.

So far, so good!

solving the 24 game in Forth (body)

by rjbs, created 2016-08-23 10:46
last modified 2016-08-23 18:41

About a month ago, Mark Jason Dominus posted a simple but difficult arithmetic puzzle, in which the solver had to use the basic four arithmetic operations to get from the numbers (6, 6, 5, 2) to 17. This reminded me of the 24 Game, which I played when I paid my infrequent visits to middle school math club. I knew I could solve this with a very simple Perl program that would do something like this:

  for my $inputs ( permutations_of( 6, 6, 5, 2 ) ) {
    for my $ops ( pick3s_of( qw( + - / * ) ) ) {
      for my $grouping ( 'linear', 'two-and-two' ) {
        next unless $target == solve($inputs, $ops, $grouping);
        say "solved it: ", explain($inputs, $ops, $grouping);
      }
    }
  }

All those functions are easy to imagine, especially if we're willing to use string eval, which I would have been. I didn't write the program because it seemed obvious.
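Spelled out, that brute force is only a few lines of Python. This is my own sketch, not the author's unwritten Perl; solve24 and the expression strings are invented here:

```python
from itertools import permutations, product

OPS = "+-*/"

def solve24(inputs, target, eps=0.001):
    """Brute force: every operand order, operator triple, and grouping."""
    seen = set()
    for a, b, c, d in permutations(inputs):
        for o1, o2, o3 in product(OPS, repeat=3):
            # the two possible groupings of four operands:
            # linear: a ~ (b ~ (c ~ d)); two-and-two: (a ~ b) ~ (c ~ d)
            for expr in (f"{a} {o1} ({b} {o2} ({c} {o3} {d}))",
                         f"({a} {o1} {b}) {o2} ({c} {o3} {d})"):
                try:
                    if abs(eval(expr) - target) < eps:
                        seen.add(expr)
                except ZeroDivisionError:
                    pass
    return sorted(seen)
```

Running solve24((6, 6, 5, 2), 17) turns up, among others, 6 * (2 + (5 / 6)).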

On the other hand, I had Forth on my brain at the time, so I decided I'd try to solve the problem in Forth. I told Dominus, saying, "As long as it's all integer division! Forth '83 doesn't have floats, after all." First he laughed at me for using a language with only integer math. Then he told me I'd need to deal with fractions. I thought about how I'd tackle this, but I had a realization: I use GNU Forth. GNU's version of almost everything is weighed down with oodles of excess features. Surely there would be floats!

In fact, there are floats in GNU Forth. They're fun and weird, like most things in Forth, and they live on their own stack. If you want to add the integer 1 to the float 2.5, you don't just cast 1 to a float, you move it from the data stack to the float stack:

2.5e0 1. d>f f+

This puts 2.5 on the float stack and 1 on the data stack. The dot in 1. doesn't indicate that the number is a float, but that it's a double. Not a double-precision float, but a two-cell value. In the Forth implementation I'm using, 1 gets you an 8-byte 1 and 1. gets you a 16-byte 1. They're both integer values. (If you wrote 1.0 instead, as I was often tempted to do, you'd be making a double that stored 10, because the position of the dot doesn't matter.) d>f takes a double from the top of the data stack, converts it to a float, and puts it on the top of the float stack. f+ pops the top two floats, float-adds them, and pushes the result back onto the float stack. Then we could verify that it worked by using f.s to print the entire float stack to the console.

Important: You have to keep in mind that there are two stacks, here, because it's very easy to manipulate the wrong stack and end up with really bizarre results. GNU Forth has locally named variables, but I chose to avoid them to keep the program feeling more like Forth to me.


I'm going to run through how my Forth 24 solver works, not in the order it's written, but top-down, from most to least abstract. The last few lines of the program, something like int main, are:

  17e set-target
  6e 6e 5e 2e set-inputs

  ." Inputs are: " .inputs
  ." Target is : " target f@ fe. cr
  ' check-solved each-expression

This sets up the target number and the inputs. Both of these are stored, not in the stack, but in memory. It would be possible to keep every piece of the program's data on the stack, I guess, but it would be a nightmare to manage. Having words that use more than two or three pieces of data from the stack gets confusing very quickly. (In fact, for me, having even one or two pieces can test my concentration!)

set-target and set-inputs are words meant to abstract a bit of the mechanics of initializing these memory locations. The code to name these locations, and to work with them, looks like this:

  create inputs 4 floats allot              \ the starting set of numbers
  create target 24 ,                        \ the target number

  : set-target target f! ;

  \ sugar for input number access
  : input-addr floats inputs + ;
  : input@ input-addr f@ ;
  : input! input-addr f! ;
  : set-inputs 4 0 do i input-addr f! loop ;

create names the current memory location. allot moves the next allocation forward by the size it's given on the stack, so create inputs 4 floats allot names the current allocation space to inputs and then saves the next four floats worth of space for use. The comma is a word that compiles a value into the current allocation slot, so create target 24 , allocates one cell of storage and puts a single-width integer 24 in it.

The words @ and ! read from and write to a memory address, respectively. set-target is trivial, just writing the number on the stack to a known memory location. Note, though, that it uses f!, a variant of ! that pops the value to set from the float stack.

set-inputs is built in terms of input-addr, which returns the memory address for a given offset from inputs. If you want the final (3rd) input, it's stored at inputs plus the size of three floats. That's:

  inputs 3 floats +

When we make the three a parameter, we swap the order of the operands to plus so we can write:

  floats inputs + ( the definition of input-addr )

set-inputs loops from zero to three, each time popping a value off of the float stack and storing it in the next slot in our four-float array at inputs.


Now we have an array in memory storing our four inputs. We also want one for storing our operators. In fact, we want two: one for the code that implements an operator and one for a name for the operator. (In fact, we could store only the name, and then interpret the name to get the code, but I decided I'd rather have two arrays.)

  create op-xts ' f+ , ' f- , ' f* , ' f/ ,
  create op-chr '+  c, '-  c, '*  c, '/  c,

These are pretty similar to the previous declarations: they use create to name a memory address and commas to compile values into those addresses. (Just like f, compiles a float, c, compiles a single char.) Now, we're also using ticks. We're using tick in two ways. In ' f+, the tick means "get the address of the next word and compile that instead of executing the word." It's a way of saying "give me a function pointer to the next word I name." In '+, the tick means "give me the ASCII value of the next character in the input stream."

Now we've got two arrays with parallel indexes, one storing function pointers (called execution tokens, or xts, in Forth parlance) and one storing single-character names. We also want some code to get items out of these arrays, but there's a twist. When we iterate through all the possible permutations of the inputs, we can just shuffle the elements in our array and use it directly. When we work with the operators, we need to allow for repeated operators, so we can't just shuffle the source list. Instead, we'll make a three-element array to store the indexes of the operators being considered at any given moment:

  create curr-ops 0 , 0 , 0 ,

We'll make a word curr-op!, like ones we've seen before, for setting the op in position i.

  : curr-op! cells curr-ops + ! ;

If we want the 0th current operator to be the 3rd one from the operators array, we'd write:

  3 0 curr-op!

Then when we want to execute the operator currently assigned to position i, we'd use op-do. To get the name (a single character) of the operator at position i, we'd use op-c@:

  : op-do    cells curr-ops + @ cells op-xts + @ execute ;
  : op-c@    cells curr-ops + @ op-chr + c@ ;

These first get the value j stored in the ith position of curr-ops, then get the jth value from either op-xts or op-chr.
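In Python terms, the double indirection looks like this (my sketch; the function names just mirror the Forth words):

```python
import operator

# Two parallel arrays, as in the Forth code: one of "function pointers"
# (execution tokens) and one of single-character names.
op_xts = [operator.add, operator.sub, operator.mul, operator.truediv]
op_chr = ['+', '-', '*', '/']

# curr_ops[i] holds the *index* of the operator assigned to slot i.
curr_ops = [0, 0, 0]

def curr_op_store(j, i):      # Forth: j i curr-op!
    curr_ops[i] = j

def op_do(i, a, b):           # Forth: i op-do (operands come from the stack)
    return op_xts[curr_ops[i]](a, b)

def op_c(i):                  # Forth: i op-c@
    return op_chr[curr_ops[i]]

curr_op_store(3, 0)           # slot 0 now holds '/', the operator at index 3
```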

permutations of inputs

To get every permutation of the input array, I implemented Heap's algorithm, which has the benefit of being not just efficient, but also dead simple to implement. At first, I began implementing a recursive form, but ended up writing it iteratively because I kept hitting difficulties in stack management. In my experience, when you manage your own stacks, recursion gets significantly harder.

  : each-permutation ( xt -- )
    init-state
    dup execute
    0 >r
    begin
      4 i <= if rdrop drop exit then
      i i hstate@ > if
        i do-ith-swap
        dup execute
        i hstate1+!
        zero-i
      else
        0 i hstate!
        inc-i
      then
    again ;

This word is meant to be called with an xt on the stack, which is the code that will be executed with each distinct permutation of the inputs. That's what the comment (in parentheses, like this) tells us. The left side of the double dash describes the elements consumed from the stack, and the right side is the elements left on the stack.

init-state sets the procedure's state to zero. The state is an array of counters with as many elements as the array being permuted. Our implementation of each-permutation isn't generic. It only works with a four-element array, because init-state works off of hstate, a global four element array. It would be possible to make the permutor work on different sizes of input, but it still wouldn't be reentrant, because every call to each-permutation shares a single state array. You can't just get a new array inside each call, because there's no heap allocator to keep track of temporary use of memory.

(That last bit is stretching the truth. GNU Forth does have words for heap allocation, which just delegate to C's alloc and friends. I think using them would've been against the spirit of the thing.)

The main body of each-permutation is a loop, built using the most generic form of Forth loop, begin and again. begin tucks away its position in the program, and again jumps back to it. This isn't the only kind of loop in Forth. For example, init-state initializes our four-element state array like this:

  : init-state 4 0 do 0 i hstate! loop ;

The do loop there iterates from 0 to 3. Inside the loop body (between do and loop) the word i will put the current iteration value onto the top of the stack. It's not a variable, it's a word, and it gets the value by looking in another stack: the return stack. Forth words are like subroutines. Every time you call one, you are planning to return to your call site. When you call a word, your program's current execution point (the program counter), plus one, is pushed onto the return stack. Later, when your word hits an exit, it pops off that address and jumps to it.

The ; in a Forth word definition compiles to exit, in fact.

You can do really cool things with this. They're dangerous too, but who wants to live forever? For example, you can drop the top item from the return stack before returning, and do a non-local return to your caller's caller. Or you can replace your caller with some other location, and return to that word -- but it will return to your caller's caller when it finishes. Nice!

Because it's a convenient place to put stuff, Forth ends up using the return stack to store iteration variables. They have nothing to do with returning, but that's okay. In a tiny language machine like those that Forth targets, some features have to pull double duty!

begin isn't an iterating loop, so there's no special value on top of the return stack. That's why I put one there before the loop starts with 0 >r, which puts a 0 on the data stack, then pops the top of the data stack to the top of the return stack. I'm using this kind of loop because I want to be able to reset the iterator to zero. I could have done that with a normal iterating loop, I guess, but it didn't occur to me at the time, and now that I have working code, why change it?

Iterator reset works by setting i back to 0 with the zero-i word. In a non-resetting loop iteration, we increment i with inc-i. Of course, i isn't a variable, it's a thing on the return stack. I made these words up, and they're implemented like this:

  : zero-i r> rdrop 0 >r >r ;
  : inc-i  r> r> 1+ >r >r ;

Notice that both of them start with r> and end with >r. That's me saving and restoring the top item of the return stack. You see, once I call zero-i, the top element of the return stack is the call site! (Well, the call site plus one.) I can't just replace it, so I save it to the data stack, mess around with the second item on the return stack (which is now the top item) and then restore the actual caller so that when I hit the exit generated by the semicolon, I go back to the right place. Got it? Good!
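If the dance is hard to follow, here's a toy Python model of the return stack (entirely my own illustration; the real return stack holds addresses, not strings):

```python
# Toy model of the Forth return stack: rstack[-1] is the top of the stack.
# When zero-i runs, the top is zero-i's own return address, and the loop
# counter i sits just below it.

def zero_i(rstack):
    ret = rstack.pop()    # r>     save our own return address
    rstack.pop()          # rdrop  throw away the old i
    rstack.append(0)      # 0 >r   push a fresh zero for i
    rstack.append(ret)    # >r     put the return address back on top

def inc_i(rstack):
    ret = rstack.pop()    # r>     save our own return address
    i = rstack.pop()      # r>     grab i
    rstack.append(i + 1)  # 1+ >r  push i+1
    rstack.append(ret)    # >r     put the return address back on top
```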

Apart from that stuff, this word is really just the iterative Heap's algorithm from Wikipedia!
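For reference, here is that iterative Heap's algorithm transcribed into Python (my sketch, with the counter array named after the Forth code's hstate):

```python
def each_permutation(a, visit):
    """Iterative Heap's algorithm: call visit() for every permutation of a."""
    n = len(a)
    c = [0] * n          # the counter array (hstate in the Forth version)
    visit(a)             # visit the initial arrangement
    i = 0
    while i < n:
        if c[i] < i:
            # which element gets swapped with a[i] depends on the parity of i
            j = 0 if i % 2 == 0 else c[i]
            a[j], a[i] = a[i], a[j]
            visit(a)
            c[i] += 1
            i = 0        # the zero-i step
        else:
            c[i] = 0
            i += 1       # the inc-i step
```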

nested iteration

Now, the program didn't start by calling each-permutation, but each-expression. Remember?

  ' check-solved each-expression

That doesn't just iterate over operand iterations, but also over operations and groupings. It looks like this:

  : each-expression ( xt -- )
    2 0 do
      i 0= linear !
      dup each-opset
      loop drop ;

It expects an execution token on the stack, and then calls each-opset twice with that token, setting linear to true for the first call and false for the second. linear controls which grouping we'll use, meaning which of the two ways we'll evaluate the expression we're building:

  Linear    : o1 ~ ( o2 ~ ( o3 ~ o4 ) )
  Non-linear: (o1 ~ o2) ~ (o3 ~ o4)

each-opset is another iterator. It, too, expects an execution token and repeatedly passes it to something else. This time, it calls each-permutation, above, once with each possible combination of operator indexes in curr-op.

  : each-opset ( xt -- )
    4 0 do i 0 curr-op!
      4 0 do i 1 curr-op!
        4 0 do i 2 curr-op!
          dup each-permutation
          loop loop loop drop ;

This couldn't be much simpler! It's exactly like this:

  for i in (0 .. 3) {
    op[0] = i
    for j in (0 .. 3) {
      op[1] = j
      for k in (0 .. 3) {
        op[2] = k
        each_permutation(xt)
      }
    }
  }

inspecting state as we run

Now we have the full stack needed to call a given word for every possible expression. We have three slots, each holding one of four operators. We have four operands to rearrange. We have two possible groupings. We should end up with 4! x 4³ x 2 expressions. That's 3072. It should be easy to count them by passing a counting function to the iterator!

  create counter 0 ,

  : count-iteration
    1 counter +!    \ add one to the counter
    counter @ . cr  \ then print it and a newline
  ;

  ' count-iteration each-expression

When run, we get a nice count up from 1 to 3072. It works! Similarly, I wanted to eyeball whether I got the right equations, so I wrote a number of different state-printing words, but I'll only show two here. First was .inputs, which prints the state of the input array. (It's conventional in Forth to start a string printing word's name with a dot, and to end a number printing word's name with a dot.)

  : .input  input@ fe. ;
  : .inputs 4 0 do i .input loop cr ;

.inputs loops over the indexes to the array and for each one calls i .input, which gets and prints the value. fe. prints a formatted float. Here's where I hit one of the biggest problems I'd have! This word prints the floats in their order in memory, which we might think of as left to right. If the array has [8, 6, 2, 1], we print that.

On the other hand, when we actually evaluate the expression, which we'll do a bit further on, we get the values like this:

4 0 do i input@ loop \ get all four inputs onto the float stack

Now the stack contains [1, 2, 8, 6]. The order in which we'll evaluate them is the reverse of the order we had stored them in memory. This is a big deal! It would've been possible to ensure that we operated on them the same way, for example by iterating from 3 to 0 instead of 0 to 3, but I decided to just leave it and force myself to think harder. I'm not sure if this was a good idea or just self-torture, but it's what I did.

The other printing word I wanted to show is .equation, which prints out the equation currently being considered.

  : .equation
    linear @ if
      0 .input 0 .op
      1 .input 1 .op
      (( 2 .input 2 .op 3 .input ))
    else
      (( 0 .input 0 .op 1 .input ))
      1 .op
      (( 2 .input 2 .op 3 .input ))
    then
    ." = " target f@ fe. cr ;

Here, we pick one of two formatters, based on whether or not we're doing linear evaluation. Then we print out the ops and inputs in the right order, adding parentheses as needed. We're printing the parens with (( and )), which are words I wrote. The alternative would have been to write things like:

  ." ( " 2 .input 2 .op 3 .input ." ) "

...or maybe...

  .oparen 2 .input 2 .op 3 .input

My program is tiny, so having very specialized words makes sense. Forth programmers talk about how you don't program in Forth. Instead, you program Forth itself to build the language you want, then do that. This is my pathetic dime store version of doing that. The paren-printing functions look like:

  : (( ." ( " ;
  : )) ." ) " ;

testing the equation

Now all we need to do is write something to actually test whether the equations hold and tell us when we get a winner. That looks like this:

  : check-solved
    this-solution target f@ 0.001e f~rel
    if .equation then ;

This is what we passed to each-expression at the beginning! We must be close to done now...

this-solution puts the value of the current expression onto the top of the (float) stack. target f@ gets the target number. Then we use f~rel. GNU Forth doesn't give you an f= operator to test float equality, because testing float equality without thinking about it is a bad idea: it's too easy to lose precision to floating point mechanics. Instead, there are a bunch of float comparison operators. f~rel takes three items from the stack and puts a boolean onto the data stack. Those items are two values to compare, and an allowed margin of error. We're going to call the problem solved if we're within 0.001 of the target. If we are, we'll call .equation and print out the solution we found.
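The tolerance test is easy to model in Python. This sketch uses a plain absolute margin, matching the post's 0.001 figure; the exact semantics of gforth's float comparison words are a bit more subtle:

```python
def check_solved(value, target, eps=0.001):
    # Treat the equation as solved when the result is within eps of the
    # target, instead of testing float equality directly.
    return abs(value - target) < eps
```

Testing with a tolerance avoids false negatives from accumulated floating point error, like 0.1 + 0.2 not being exactly 0.3.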

The evaluator, this-solution, looks like this:

  : this-solution
    4 0 do i input@ loop

    linear @ if
      2 op-do 1 op-do 0 op-do
    else
      2 op-do
      frot frot
      0 op-do
      fswap
      1 op-do
    then ;

What could be simpler, right? We get the inputs out of memory (meaning they're now in reverse order on the stack) and pick an evaluation strategy based on the linear flag. If we're evaluating linearly, we execute each operator's execution token in order. If we're grouping, it works like this:

          ( r1 r2 r3 r4 ) \ first, all four inputs are on the stack
  2 op-do ( r1 r2 r5    ) \ we do first op, putting its result on stack
  frot    ( r2 r5 r1    ) \ we rotate the third float to the top
  frot    ( r5 r2 r1    ) \ we rotate the third float to the top again
                          \ ...so now the "bottom" group of inputs is on top
  0 op-do ( r5 r6       ) \ we do the last op, evaluating the bottom group
  fswap   ( r6 r5       ) \ we restore the "real" order of the two groups
  1 op-do ( r7          ) \ we do the middle op, and have our solution
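
If the stack juggling is hard to follow, here's the shape of the two strategies in Python. This is just the grouping logic, not a faithful translation; operand order on the Forth float stack differs from this plain left-to-right reading:

```python
from operator import add, mul, sub

def evaluate(inputs, ops, linear):
    # Four inputs, three binary ops.  "Linear" chains the ops with no
    # grouping; "grouped" joins two pairs with the middle op:
    #   linear:  ((a op0 b) op1 c) op2 d
    #   grouped: (a op0 b) op1 (c op2 d)
    a, b, c, d = inputs
    op0, op1, op2 = ops
    if linear:
        return op2(op1(op0(a, b), c), d)
    return op1(op0(a, b), op2(c, d))

print(evaluate((2, 3, 4, 5), (mul, sub, add), linear=True))   # ((2*3)-4)+5 = 7
print(evaluate((2, 3, 4, 5), (mul, sub, add), linear=False))  # (2*3)-(4+5) = -3
```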

That's it! That's the whole 24 solver, minus a few tiny bits of trivia. I've published the full source of the program on GitHub.

JSON::Typist (body)

by rjbs, created 2016-08-06 13:17

I've been meaning, for a while, to make a little post about a library I wrote a while ago.

Perl 5's type system is a mixed bag. Sometimes it's great, because you don't need to worry about types, and sometimes it's a pain, because you wish you could worry about types. There have been a number of proposals or attempts to sort this out over time, but basically nothing has happened. My guess is that not much is ever really going to happen, and that's okay. I still like Perl 5.

Sometimes, though, the lack of typing really does get in the way. In my experience, it's mostly when you need to deal with something outside of Perl that does have a strong distinction between numbers and strings. This can often be in the interchange of serialized data structures. JSON, for example, has three fundamental types that are more or less all muddled together: numbers, strings, and booleans.

When using JSON.pm, booleans can be produced by using \0 and \1, which is a bit weird, but ends up working really nicely. When read in, booleans become objects. Okay!

Strings and numbers can be produced by serializing "$x" or 0+$x directly, or by starting with string or number literals, which is maybe okay, but inspecting the data before it gets serialized can ruin this effect:

  ~$ perl -MJSON -E '$x = 0; say $x; say JSON->new->encode([$x])'
  0
  [0]

  ~$ perl -MJSON -E '$x = 0; say "$x"; say JSON->new->encode([$x])'
  0
  ["0"]

That say "$x" could always be buried deep in some subroutine, and you end up with spooky action at a distance.

Similarly, if you read in JSON and wanted to check what types the data had, you'd end up using B::svref_2object or other much-too-low-level tools. I wanted to be able to get objects back, just like I did with a boolean. I don't want this all the time, only sometimes, but when I want it, I want it!

I wrote JSON::Typist, which walks a structure produced by a JSON decode and returns a new structure, replacing string and number leafs with objects:

  my $content = q<{ "number": 5, "string": "5" }>;

  my $json = JSON->new->convert_blessed->canonical;

  my $payload = $json->decode( $content );
  my $typed   = JSON::Typist->new->apply_types( $payload );

  $typed->{string}->isa('JSON::Typist::String'); # true
  $typed->{number}->isa('JSON::Typist::Number'); # true

I'm using it for testing a web service that must provide data in the right types. It isn't enough to make sure that $data->{id} eq $expected, I also need to know that it was provided as a string. With JSON::Typist, I can.
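The same trick is easy to picture in another language. This Python sketch (hypothetical names, not a port) walks a decoded structure and wraps the leaves in marker classes, much as JSON::Typist wraps Perl values:

```python
import json

class JsonString(str):
    """Marker class: this leaf was a JSON string."""

class JsonNumber(float):
    """Marker class: this leaf was a JSON number."""

def apply_types(value):
    # Check bool first: in Python, bool is a subclass of int.
    if isinstance(value, bool):
        return value
    if isinstance(value, str):
        return JsonString(value)
    if isinstance(value, (int, float)):
        return JsonNumber(value)
    if isinstance(value, list):
        return [apply_types(v) for v in value]
    if isinstance(value, dict):
        return {k: apply_types(v) for k, v in value.items()}
    return value

typed = apply_types(json.loads('{ "number": 5, "string": "5" }'))
print(isinstance(typed["string"], JsonString))  # True
print(isinstance(typed["number"], JsonNumber))  # True
```

The wrapped values still compare equal to their plain counterparts, so tests can assert both the value and the JSON type.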

I know this library needs some more work, and I need to build some test tools (maybe adding on to Test2::Compare?) to work with the structures I get back, but this has allowed me to test for (and then fix) a bunch of bugs, so I'm pretty happy with having gotten started.

HTML::MasonX::Free (body)

by rjbs, created 2016-05-24 21:45
tagged with: @markup:md journal

Who's ready to live in the past? Me!

Every time I try to like some other templating system in Perl, I fail. The only one I sort of like is Mason. No, no, not Mason 2. I don't like that. I like HTML::Mason. You know, Mason 1.

It has about a zillion problems, but the biggest problem, I think, is just its reputation. People think it's guaranteed to lead to some kind of awful "whole app written inside your templates," just because its original use case was "you can write our whole app inside your templates." But we believe in second chances, right?

For years now, I've wanted to write a Mason-inspired Mason replacement. I just haven't. I did, though, write a bunch of plugins to Mason to change how it behaved. They've made it a lot nicer to work with, and I thought I'd give a bit of a quick run-through on what they do. Maybe someone else will find them useful, although… well, I guess it could happen!

Stricter Component Interfaces

So, a typical Mason component might look like:


  <%method greeting>
  Hey, <% $name %>
  </%method>

    <head><title>Your face</title></head>
    <& SELF:body, name => $name &>

  <%method body>
    <div><& SELF:greeting, name => $name &></div>
  </%method>

  <!-- good night! -->

Even in this dumb contrived example, it can be hard to figure out the entry point. Basically, anything that isn't part of some other special block like <%method> or <%args> or <%def> is "the main thing that gets run." You could write your Perl programs like this, too, switching between the main code and subroutine definitions as you go, mixing them together, but you wouldn't. Right? No, you wouldn't.

Sometimes, we even encapsulate the main part of a program in sub main, like some other languages do. Then you run the program by calling main() at the end.

HTML::MasonX::Free::Compiler lets you do this with your Mason components. First, it forbids stray content. Everything must be inside a method or doc block (or similar structures), or the compiler barfs.

Then, when you render a component, there's a default method to call. So, if you call <& /some/component &> — which is what happens when you find and render a path — then it actually ends up calling /some/component:main. This forces a non-nesting structure where you're not interleaving a bunch of blocks inside of your main content.
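In other words, rendering a component becomes ordinary method dispatch with a fixed entry point. A hypothetical sketch of that dispatch in Python:

```python
# Hypothetical sketch: each component is a table of named renderers, and
# calling a component path always dispatches to its "main" method.
components = {
    "/some/component": {
        "main": lambda: "body: " + call("/some/component", "body"),
        "body": lambda: "<div>hello</div>",
    },
}

def call(path, method="main"):
    # <& /some/component &> amounts to call("/some/component"),
    # which really means calling /some/component:main.
    return components[path][method]()

print(call("/some/component"))  # body: <div>hello</div>
```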

Component Roots as Subclass Overlays

The Mason resolver maps component paths (which look like file paths) to components. In general it does that by looking through a file tree, but it can be more abstract, like in MasonX::Resolver::WidgetFactory. By default, though, it works like this:

Say you have three roots, /X and /Y and /Z. Then these two things exist:

  /X/vehicle/car
  /Z/vehicle/car

…and then you call /vehicle/car.

Traditionally, Mason will look through the component roots and find the component in the first root that has it. In this case, that's /X. The component at /X/vehicle/car is then called. Calling (exec-ing) that component actually means walking up its ancestry to its inheritance root and calling that, which will then call $m->next until it gets back down to the actually-requested component.

This is nuts.

It made a bit of sense once upon a time when the default parent, autohandler was used for things like permissions checks. I'm only using Mason for templates, though, so forget that! I want to use inheritance in a more traditional way, for a more specialized version of a general thing. For this, I wrote HTML::MasonX::Free::Resolver. It gets a list of roots, but they're treated like overlays.

I'll elaborate. In the standard configuration, /X/vehicle/car can never have a parent under /Z. The default tree is:

  /X/vehicle/car
    -> /X/vehicle/autohandler
      -> /X/autohandler

With HTML::MasonX::Free::Resolver, we'd get:

  /X/vehicle/car
    -> /Z/vehicle/car

And while traditional Mason would call its tree from the bottom up, ours calls from the top down. Since all our components have a main method, a pretty simple thing to do is to have this in the "base" template /Z/vehicle/car:

<%method main>
This is a <% SELF:color %> <% SELF:type %> car.
</%method>

<%method color>grey</%method>
<%method type>motorized</%method>

…and in your "derived" template, /X/vehicle/car just:

<%method type>hybrid</%method>

This makes it easy to have a generic pack of templates that you customize on a per-install basis by adding a new root at the derived end of the list.
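The overlay rule is small enough to sketch in a few lines of Python (hypothetical names; `exists` stands in for checking a root's file tree):

```python
def resolve(path, roots, exists):
    # Overlay-style resolution: the component lives in the first root
    # that has it, and each later root that has the same path supplies
    # the next ancestor, instead of an autohandler chain.
    chain = [root + path for root in roots if exists(root + path)]
    return chain  # chain[0] is the component, chain[1:] its ancestry

files = {"/X/vehicle/car", "/Z/vehicle/car", "/Z/autohandler"}
print(resolve("/vehicle/car", ["/X", "/Y", "/Z"], files.__contains__))
# ['/X/vehicle/car', '/Z/vehicle/car']
```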

One fun fact: the component roots in Mason aren't stored in the resolver, but in the interpreter, even though the resolver is the thing that does the resolving. In order to have HTML::MasonX::Free::Resolver be in charge of its roots, you have to put a special value into comp_root to indicate, "yes, I realize this won't ever get used."

HTML Entity Encoding with Fewer Screw-ups

Say your template has this:

  <input value='<% $value %>' />

Well, you'd never do that, right, because you'd use a widget generator? But let's pretend you would. The bug is that you probably didn't escape the entities in $value, so maybe there's an HTML injection attack there. You might have wanted:

  <input value='<% $value |html %>' />

That weird-o pipe thing is Mason's filtering syntax. You probably almost always want to entity encode things, so you might set the default_escape_flags on your compiler to html. Then, when you don't want to encode, you do this:

  <div><% $known_html |n %></div>

This means, "no escaping for this, please." The problem is that you might want to write a method that accepts a parameter that could be of either type. There's no default way to know, and if you get it wrong, you're screwed. You can find yourself in that situation in a number of ways.

HTML::MasonX::Free::Escape provides a replacement for the default html filter that can be given an argument that is known to be HTML. You generate it by using the html_hunk routine, like this:

  % my $text = "D&D";
  % my $html = html_hunk("D&amp;D");
  I like playing <% $text %> and more <% $html %>.

The rendered text will encode $text without double-encoding $html. You also can't accidentally do this:

  % my $html = html_hunk("D&amp;D");
  % my $string = "My favorite game is $html.";

Or, rather, you can, but it will be a runtime error instead of a weird-o double encoding showing up somewhere.
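The idea ports cleanly: a marker type for already-encoded HTML that the escaper passes through untouched. Here's a Python sketch of that shape (the Perl version can additionally make plain string interpolation of a hunk a runtime error, which a bare str subclass can't):

```python
import html

class HtmlHunk(str):
    """Marker type: the contents are already HTML; don't encode again."""

def html_hunk(raw):
    return HtmlHunk(raw)

def html_escape(value):
    # Encode plain text; pass known-HTML hunks through untouched.
    if isinstance(value, HtmlHunk):
        return str(value)
    return html.escape(str(value))

print(html_escape("D&D"))                 # D&amp;D
print(html_escape(html_hunk("D&amp;D")))  # D&amp;D (not double-encoded)
```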

That's it!

So, these don't really make Mason an amazingly modern thing, but they help sand down a few of its most obvious warts, and that's been good enough for me!

Test::PgMonger, which you should probably not use (body)

by rjbs, created 2016-05-14 13:52

JMAP is a protocol that is meant to replace IMAP, CalDAV, CardDAV, SMTP, ACAP (ha ha), and probably some other protocols that aren't springing to mind. Like IMAP, it's meant to make it easy to synchronize offline work with an authoritative server. It does this by dividing up the data model into collections of discrete types, with each collection in a known and addressable state.

I'm not writing this post to talk about JMAP itself, though.

The JMAP model can be useful for things other than email, contacts, and that sort of thing. Why not make other things syncable in the same way? I've been writing a library to make this easy (or at least less difficult) to do. Given a DBIx::Class schema, my library Ix constructs a JMAP-like method dispatcher as well as a Plack application to publish it.

Ix is not even remotely ready for doing real work, so I'm not writing this post to talk about Ix, either.

Since an Ix application uses a database for storing all its entities, its test suite needed a database. I started out by using my usual strategy for testing simple database stuff: SQLite! I love SQLite. It is great. For each test, I could make a new SQLite database, deploy the DBIx::Class schema, and run tests. Then I'd delete the file. Done!

As the SQL that I was generating got more complex, I realized that using SQLite was no longer a good idea. It was great for getting started, but now I needed to run my tests against the same setup I'd have in production. I installed postgresql on my testing box and Postgres.app on my laptop. (By the way, have you seen Postgres.app? It runs Postgres, as you, on your Mac, just like a normal OS X app. It puts an elephant in the menu bar. Neat!) I still needed something to create and destroy my Postgres databases, though, since they weren't just files anymore.

I had a look at Test::Database, but it didn't do what I needed. I'll write (at least to BooK!) about the specific problems, but basically Test::Database's view of test databases is that they aren't nearly as single-use or disposable as what I wanted, and it wasn't easy to extend. Eventually, I wrote my own dumb little library, inspired by parts of Test::Database. It is called Test::PgMonger (pronounced "pig monger"), and it's stupid and effective.

The PgMonger object has credentials to PostgreSQL with permissions to create new users and databases. For now, I'm just assuming that localhost is trusted and I can use the postgres user. It uses those credentials to create a new user and a new database under a unique prefix. That database gets cleaned up at program exit, and there's a way to tell the PgMonger to kill all the databases that match its creation pattern, in case some escape deletion due to crashes or other screw-ups.
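The pattern itself is tiny. Here's a Python sketch with the privileged database handle abstracted away as an `execute` callable (hypothetical names; the real module is Perl and talks to an actual Postgres server):

```python
import atexit
import uuid

PREFIX = "pgmonger_test_"

def create_test_db(execute, prefix=PREFIX):
    # Make a uniquely named database and arrange for it to be dropped
    # when the program exits.
    name = prefix + uuid.uuid4().hex[:12]
    execute(f'CREATE DATABASE "{name}"')
    atexit.register(lambda: execute(f'DROP DATABASE IF EXISTS "{name}"'))
    return name

def clean_house(execute, existing_dbs, prefix=PREFIX):
    # Kill any database matching the creation pattern that escaped
    # deletion due to crashes or other screw-ups.
    for db in existing_dbs:
        if db.startswith(prefix):
            execute(f'DROP DATABASE IF EXISTS "{db}"')
```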

This is a really simple hunk of code, and even so it needs more refinement. Hopefully Test::Database can pick up the things I need so that I'm freed from thinking about this one-off thing. For now, though, this has made my testing really painless!

I went to the Perl QA Hackathon in Rugby! (body)

by rjbs, created 2016-04-26 22:45
last modified 2016-04-29 08:22

I've long said that the Perl QA Hackathon is my favorite professional event of the year. It's better than any conference, where there are some good talks and some bad talks, and some nice socializing. At the Perl QA Hackathon, stuff gets done. I usually leave feeling like a champ, and that was generally the case this time, too.

I flew over with Matt Horsfall, and the trip was fine. We got to the hotel in the early afternoon, settled in, played some chess (stalemate) and then got dinner with the folks there so far. I was delighted to get a (totally adequate) steak and ale pie. Why can't I find these in Philly? No idea.

steak and ale pie!!

The next day, we got down to business quickly. We started, as usual, with about thirty seconds of introduction from each person, and then we shut up and got to work. This year, we had most of a small hotel entirely to ourselves. This gave us a dining room, a small banquet hall, a meeting room, and a bar. I spent most of my time in the banquet hall, near the window. It seemed like the easiest place to work. Although there were many good things about the hotel, the chairs were not one of them! Still, it worked out just fine.

The MetaCPAN crew were in the dining room, a few people stayed at the bar seating most of the time, and the board room got used by various meetings, most of which I attended.

The view over my shoulder most of the time, though, was this:

getting to work!

Philippe wasn't always there with that camera, though. Just most of the time.

I think my work falls into three categories: Dist::Zilla work, meeting work, and pairing work.


Dist::Zilla

Two big releases of Dist::Zilla came out of the QAH. One was v5.047 (and the two preceding it), which closed about 40 issues and pull requests. Some of those just needed application, but others needed tests, or rework, or review, or whatever. Amusingly enough, other people at the QAH were working on Dist::Zilla issues, so as I tried to close out the obvious branches, more easy-to-merge branches kept popping up!

Eventually I got down to things that I didn't think I could handle and moved on to my next big Dist::Zilla task for the day: version six!

My goal with Dist::Zilla has been to have a new major version every year or two, breaking backward compatibility if needed, to fix things that seemed worth fixing. I've been very clear that while I value backcompat quite a lot in most software, Dist::Zilla will remain something of a wild west, where I will consider nothing sacred, if it gets me a big win. The biggest change for v6 was replacing Dist::Zilla's use of Path::Class with Path::Tiny. This was not a huge win, except insofar as it lets me focus on knowing and using a single API. It's also a bit faster, although it's hard to notice that under Dist::Zilla's lumbering pace.

Karen Etheridge and I puzzled over some encoding issues, specifically around PPI. The PPI plugin had changed, about two years ago, to passing octets rather than characters to PPI, and we weren't sure why. Karen was convinced that PPI did better with characters, but I had seen it hit a fatal bug that using octets avoided. Eventually, with the help of git blame and IRC logs, we determined that the problem was... a byte order mark. Worse yet, a BOM on UTF-8!

When parsing a string, PPI does in fact expect characters, but does not expect that the first one might be U+FEFF, the space character used at offset 0 in files to indicate the UTF encoding type. Perl's UTF-16 encoding layers will notice and use the BOM, but the UTF-8 layer will not, because a BOM on a UTF-8 file is a bad idea. Rather than try to do anything incredibly clever, I did something quite crude: I strip off leading U+FEFF when reading UTF-8 files and, sometimes, strings. Although this isn't always correct, I feel pretty confident that anybody who has put a literal ZERO WIDTH NO-BREAK SPACE in their code is going to deserve whatever they get.
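The crude fix is easy to state in any language. A Python sketch of the same behavior (Python even ships it as the `utf-8-sig` codec):

```python
def read_utf8_text(raw: bytes) -> str:
    # Decode UTF-8 and strip a leading U+FEFF, since a BOM on a UTF-8
    # file carries no information worth keeping.
    text = raw.decode("utf-8")
    return text[1:] if text.startswith("\ufeff") else text

print(read_utf8_text(b"\xef\xbb\xbfmy $x = 1;"))  # my $x = 1;
print(read_utf8_text(b"my $x = 1;"))              # my $x = 1;
```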

With that done, a bunch of encoding issues go away and you can once again use Dist::Zilla on code like:

my $π = 22 / 7;

This also led to some fixes for Unicode text in __DATA__ sections. As with the Path::Tiny change, a number of downstream plugins were affected in one way or another, and I did my best to mitigate the damage. In most cases, anything broken was only working accidentally before.

Dist::Zilla v6.003 is currently in trial on CPAN, and a few more releases with a few more changes will happen before v6 is stable.

Oh, and it requires perl v5.14.0 now. That's perl from about five years ago.


Meetings

I was in a number of meetings, but I'm only going to mention one: the Test2 meeting. We wanted to discuss the way forward for Test2 and Test::Builder. I think this needs an entire post of its own, which I'll try to get to soon. In short, the majority view in the room was that we should merge Test2 into Test-Simple and carry on. I am looking forward to this upgrade.

Other meetings included:

  • renaming the QAH (I'm not excited either way)
  • using Test2 directly in core for speed (turns out it's not a big win)
  • getting more review of issues filed on Software-License

A bit more about that last one: I wrote Software-License, and I feel I've done as much work on it as I care to, at least in the large. Now it gets a steady trickle of issues, and I'm not excited to keep doing it all myself. I recruited some helpers, but mostly nothing has come of it. I tried to rally the troops a bit to encourage more regular review, even if that just means each person giving a +1 or -1 on each pull request. Otherwise, Software-License is likely to languish.


Pairing

I really enjoy being "free floating helper guy" at QAH. It's something I've done a lot ever since Oslo. What I mean is this: I look around for people who look frustrated and say, "Hey, how's it going?" Then they say what's up, and we talk about the issue. Sometimes, they just need to say things out loud, and I'm their rubber duck. Other times, we have a real discussion about the problem, do a bit of pair programming, debate the benefits of different options, or whatever. Even when I'm not really involved in the part of the toolchain being worked on, I feel like I have been able to contribute a lot this way, and I know it makes me more valuable to the group in general, because it leaves me with more of an understanding of more parts of the system.

This time, I was involved in review, pairing, or discussion on:

  • fixing Pod::Simple::Search with Neil Bowers
  • testing PAUSE web stuff with Pete Sergeant
  • CPAN::Reporter client code with Breno G. de Oliveira
  • DZ plugin issues with Karen Etheridge
  • Log::Dispatchouli logging problems with Sawyer X
  • PAUSE permissions updates with Neil Bowers
  • PAUSE indexing updates with Colin Newell (I owe him more code review!)
  • improvements to PAUSE's testing tools with Matthew Horsfall
  • PPI improvements with Matthew Horsfall

...and probably other things I've already forgotten.

more hard work

Pumpking Updates

A few weeks ago, I announced that I'm retiring as pumpking after a good four and a half years on the job. On the second night of the hackathon, the day ended with a few people saying some very nice things about me and giving me both a lovely "Silver Camel" award and also a staggering collection of bottles of booze. I had to borrow some extra luggage to bring it all home. (Also, a plush camel, a very nice hardbound notebook, and a book on cocktails!) I was asked to say something, and tried my best to sound at least slightly articulate.

Meanwhile, there was a lot of discussion going on — a bit at the QAH but more via email — about who would be taking over. In the end, Sawyer X agreed to take on the job. The reaction from the group, when this was announced, was strong and positive, except possibly from Sawyer himself, as he quickly fled the room, presumably to consider his grave mistake. He did not, however, recant.

perl v5.24.0

I didn't want to spend too much time on perl v5.24.0 at the QAH, but I did spend a bit, rolling out RC2 after discussing Configure updates with Tux and (newly-minted Configure expert) Aaron Crane. I'm hoping that we'll have v5.24.0 final in about a week.

Perl QAH 2017

I'm definitely looking forward to next year's QAH, wherever it may be. This year, I had hoped to do some significant refactoring of the internals of PAUSE, but as the QAH approached, I realized that this was a task I'd need to plan ahead for. I'm hoping that between now and QAH 2017, I can develop a plan to rework the guts to make them easier to unit test and then to re-use.

Thanks, sponsors!

The QAH is a really special event, in that most of the attendees are brought to it on the sponsors' dime. It's not a conference or a fun code jam, but a summit paid for by people and corporations who know they'll benefit from it. There's a list of all the sponsors on the event page, including, but not limited to:

Raise a glass to them!

Dist::Zilla v6 is here (in trial format) (body)

by rjbs, created 2016-04-24 04:38

I've been meaning to release Dist::Zilla v6 for quite a while now, but I've finally done it as a trial release. I'll make a stable release in a week or two. So far, I see no breakage, which is about what I expected. Here's the scoop:

Path::Class has been dropped for Path::Tiny.

Actually, you get a subclass of Path::Tiny. This isn't really supported behavior. In fact, Path::Tiny tells you not to do this. It won't be here long, though, and it only needs to work one level deep, which it does. It's just enough to give people downstream a warning instead of an exception. A lot of the grotty work of updating the internals to use Path::Tiny methods instead of Path::Class methods was done by Kent Fredric. Thanks, Kent!

-v no longer takes an argument

It used to be that dzil test -v put things in verbose mode, dzil test -v Plugin put just that plugin in verbose mode, and dzil -v test screwed up because it decided you meant test as a plugin name, and then couldn't figure out what command to run.

Now -v is all-things-verbose and -V is one plugin. It turns out that single-plugin verbosity has been broken for some time, and still is. I'll fix it very soon.

Deprecated plugins deleted

I've removed [Prereq] and [AutoPrereq] and [BumpVersion]. These were long marked as deprecated. The first two are just old spellings of the now-canonically-plural versions. BumpVersion is awful and nobody should use it ever.

PkgVersion can generate "package NAME VERSION" lines

So, now you can avoid deciding how to assign to $VERSION and add the version number directly to the package declaration. This also avoids the need to have any room for blank lines in which to add $VERSION.

Dist::Zilla now requires v5.14.0

Party like it's 2011.

the "credit the last uploader" problem (body)

by rjbs, created 2016-02-12 09:16
last modified 2016-02-12 09:40
tagged with: @markup:md cpan journal perl

First, a refresher…

At its simplest, the CPAN is a bunch of files and an index. The index directs you from package names to the files that contain the latest authorized release of that package. Everything else builds on top of that.

If you want to publish Foo::Bar to the CPAN, you need to use PAUSE. PAUSE manages users and permissions, authenticates users, accepts uploads, and then decides how and whether to index them. To make those indexing decisions, first PAUSE analyzes an uploaded file to see what packages it contains. Then it compares those packages to the permissions of the uploading user. If the user has permission, and if the uploaded package is later-versioned than the existing indexed package, the package is indexed.

I have skipped some details, but I believe that for the purpose of everything else I'm going to write about, this is a sufficient explanation.
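As a sanity check on that summary, the core indexing decision can be sketched like this (hypothetical Python; real-world version comparison and the authorization rules are messier than shown):

```python
def should_index(package, version, uploader, perms, index):
    # perms maps package -> set of users with upload permission;
    # index maps package -> currently indexed version.
    if package in perms and uploader not in perms[package]:
        return False  # user lacks permission on an existing package
    if package in index and version <= index[package]:
        return False  # not later-versioned than what's indexed
    return True       # new package, or an authorized newer release

print(should_index("Pie::Packer", 1.0, "ALICE", {}, {}))  # True (first upload)
print(should_index("Pie::Packer", 1.0, "BOB",
                   {"Pie::Packer": {"ALICE"}},
                   {"Pie::Packer": 1.0}))                 # False
```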

To get permissions on a package that isn't indexed at all, you upload it. Then you have permissions. If you want to work with a package that already exists, the person who uploaded it needs to give you permission. There are two kinds of permission:

  • first-come: you're the person who first uploaded it, or the person to whom that person has handed over the keys; there is only one first-come user per package; you can upload new versions and you can assign and revoke co-maint permissions
  • co-maint: you are permitted to upload new versions, but you may not alter the permissions of the package

The Complaint

When you view code on MetaCPAN or search.cpan.org, one of the most visible details is the name (and avatar) of the last user to have uploaded that package. This creates a strong impression that this is the contact point for the package. Sometimes, this is true, or true enough. On the other hand, sometimes it's not, and that's a problem. It may be that the last person to upload the library only did so as a one-off act, or that they were a member of the team working on a project years ago when it was last released. Now, though, they will be boldly listed as the contact person.

Here's a scenario:

  • in 2002, a library, Pie::Packer, is uploaded by Alice and is popular for a while
  • in 2008, Bob finds a bug and finds that Alice isn't really working on Perl anymore; Bob offers to do a release for just this bug fix
  • Alice gives Bob co-maint on Pie::Packer
  • Bob uploads Pie::Packer v1.234, the only release he ever plans to make
  • from 2008 through 2016, Bob is sent requests for help with Pie::Packer

Bob can't just pass on permissions to stop it. He can give up permissions, but he'll still be the last uploader.

You might object: "Alice should have given Bob first-come! Then he could pass along permissions!"

This is true. Maybe in 2010, Bob gives permissions to Charlotte... but now Charlotte is stuck in the same position. If nobody ever comes along to take it over, Charlotte can't usefully get out from under the distribution.

Half a Solution

In 2013, the QA Hackathon led to a consensus about a mechanism for permission transitions. It goes something like this:

  • give user "ADOPTME" co-maint to indicate that first-come permissions can be given to someone who wants them, and you don't need to be consulted
  • give user "HANDOFF" co-maint to indicate that you're looking to pass along first-come to someone else, but they should go through you

(The third magic user, "NEEDHELP," is not relevant to the topic at hand.)

Marking a library with ADOPTME or HANDOFF is useful in theory, but not in practice, because it's almost impossible to know that it has happened. Yesterday, I filed a bug about making ADOPTME/HANDOFF visible on MetaCPAN, and I think it's critically important to making the ADOPTME/HANDOFF worth having.

So, why is this section headed "half a solution"?

Because this solution helps you if you have first-come, but not if you have co-maint. Imagine poor Bob, above, in 2016. By this point, Alice has moved off the grid and can't be contacted. Bob can't mark the dist as ADOPTME. He can ask the PAUSE admins to do so, but that's it. It's also a bit of a burden to put onto the PAUSE admins, who may not know whether Bob has really made a good faith effort to contact Alice.

The final remaining problem is this: There is no escape hatch for someone who has co-maint permissions and wants to get out from under the shadow of an unwanted upload.

The Simplest Thing That Could Possibly Work

This problem could be solved by adding a "GitHub Organizations"-like layer to PAUSE… but I think there's a much, much simpler mechanism.

We should always treat the first-come owner as the authoritative source, including when displaying a distribution on the web. MetaCPAN Web should stop showing the name and image of the latest uploader as prominently, and should show the first-come user instead. The same goes for search.cpan.org and other such sites. MetaCPAN already has a place for listing other contributors, which should contain the last uploader. Adding a note like "last upload by BOB" seems okay, too, but the emphasis should be on connecting the distribution with the one person who can actually make decisions about its future.

The Great Infocom Replay: Infidel (body)

by rjbs, created 2015-10-22 23:31

These replay write-ups get shorter and shorter as I go. I think it's because I'm growing more and more confident in what I like and don't like, and what I will and won't spend my time on. Infidel has a nice setup. I liked the setting, the starting plot, and the way the game got started. Soon enough, though, I got a "you're quite thirsty" message, and I groaned.

I found water and food and decided I'd stick with it. I could suck it up and deal with hunger puzzles. Then I got to a maze. Forget it!

There really was only one puzzle on the way to the maze, at least on the route that I took to get there. Maybe the rest of the puzzles in the game are great, but I'm not going to find out. At least, not by playing the game. That's part of my problem with the replay, now. Because I'm willing to stop playing a game when I see that there's a maze, I don't see all the good stuff on the other side of it. If it was just a maze, I might pull out a map and skip the maze, but the maze and the hunger puzzle together are lethal.

I'm going to have to find some transcripts of full game plays, maybe.

The biggest killer to my enjoyment of games with these elements is that I find myself making a map with no real immersion, just planning out how to solve the puzzles faster on my next run. I complained about this [last time, talking about Enchanter]. I much prefer when I can make the map while I play, never worrying about figuring out a critical path. (And yet, I love Suspended.)

Next up is Sorcerer.

a big ol' Catalyst upgrade (body)

by rjbs, created 2015-09-30 23:45
last modified 2015-10-01 17:18

At work, a fair bit of our web stuff is Catalyst. That's not just the user-facing website, but also internal HTTP services. For a long time, we were stuck on v5.7012, from late 2007. That's pre-Moose (which was 5.8000) and pre-Plack (which was 5.9000). It wasn't that we didn't want to upgrade, but it was a bunch of work and all the benefits we'd see immediately were little ones. It was going to free us up for a lot of future gain, but who has the time to invest in that?

Well, I'd been doing little bits of prep work and re-testing over time, and once in a while I'd see some plugin I could update or feature I could tweak, but for the most part, I'd done nothing but repeatedly note that upgrading was going to take work. A few weeks ago, I decided to make a big push and see if I could get us ready to upgrade. This would mean upgrading over eight years of Catalyst… but also Moose. We were running Moose v1.19, from late 2010.

The basic process involved here was simple:

  1. make a local::lib directory for the upgrade
  2. try to install Catalyst::Devel and Catalyst::Runtime into it
  3. sort out complications
  4. eventually, run tests for work code
  5. sort out complications
  6. deploy!

So, obviously the interesting question here is: what kind of stuff happened in steps 3 and 5?

Most of this, especially in step 3, was really uninteresting. Both Catalyst and Moose will tell you, when you upgrade them, that the upgrade is going to break old versions of plugins you had installed. So, you upgrade that stuff before you move on. Sometimes, tests would fail because of unstated dependencies or bugs that only show up when you try using 2015 modules on top of a 2007 version of some prereq. In all of this I found very little that required that I bug Catalyst devs. There was one bug where tests didn't skip properly because of a silly coding mistake. Other than that, it was mostly an n-step process of upgrading my libraries.

The more complicated problems showed up in step 5, when I was sorting out our own code that was broken by the update. There wasn't much:

  • plugins written really fast and loose with the awful Catalyst "plugins go into your @ISA" mechanism
  • encoding issues
  • suddenly missing actions (!)
  • deployment issues

In general, we fixed the first by just dropping plugins that we no longer needed. The only plugin that really mattered was the one that tied Catalyst's logging system to our internal subclass of Log::Dispatchouli::Global, and that was replaced by roughly one line:

Pobox::Web->log( Pobox::Web::Logger->new );

So, we killed off a grody plugin and replaced it with a tiny wrapper object. Win!

I also had to make this change to our configuration, which seemed a bit gratuitous, but the error message was so helpful that I couldn't be too bothered:

-view: 'View::Mason'
-default_view: 'View::Mason'
+view: 'Mason'
+default_view: 'Mason'

Encoding issues ended up being mostly the same. We dropped the Unicode plugin and then killed off one or two places where we were fiddling with encodings in the program. Honestly, I'm not 100% sure how Catalyst's old and new behavior are supposed to compare, but the end result was that we made our code more uniformly deal in text strings, and the encoding happened correctly at the border.

The missing actions were a bigger concern. What happened?!

Well, it turned out that we had a bunch of actions like this:

sub index : Chained('whatever') Args(0) { ... }

These were meant to handle /whatever, and worked just fine, because in our ancient Catalyst, the index subroutine was still handled specially. In the new version, it was just like anything else, so it started handling /whatever/index. The fix was simple:

sub index : Chained('whatever') Args(0) PathPart('') { ... }

Deployment issues were minor. We were using the old Catalyst::Helper scripts, which I always hated, and still do. Back in the day, and in fact before Catalyst::Helper existed, Dieter wrote what I considered a much superior system internally called Catbox… but we never really polished it up enough for general use. I regenerated the scripts, but this was a bit of a pain because we'd made internal changes, and because the helper script generator doesn't act nicely enough when your repo directory name doesn't quite match your application name. I got it worked out, but it didn't matter much, because of Plack!

I had been dying to get us moved to Plack for ages, and once everything was working to test, I replaced the service scripts with wrappers around plackup. I mentioned this to #plack and got quoted on plackperl.org:

"Today, I finished a sizable project to upgrade almost all of our web stuff to run on Plack. Having done that, everything is better!"

It was true. I replaced old Catalyst::Engine::HTTP::Prefork with Starman and watched the low availability reports become a trickle. I've moved a few things to Starlet since. (I couldn't at the time because of a Solaris-related bug, since fixed in Server::Starter.)
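The wrapper scripts around plackup are tiny at their core. A minimal sketch, with made-up app names, ports, and worker counts rather than the actual Pobox settings:

```shell
# Run the app's PSGI entry point under Starman via plackup.
# All names and values here are illustrative placeholders.
plackup -s Starman --workers 10 --listen :5000 myapp.psgi

# Starlet is typically run under Server::Starter's start_server,
# which is also what provides hot-restart support:
start_server --port 5000 -- plackup -s Starlet --max-workers 10 myapp.psgi
```

Either way, the HTTP server becomes a deployment detail chosen on the command line, rather than something compiled into the application's service scripts.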

The Moose upgrades were similarly painless. The main change required was in dealing with enum types, which now required that anonymous enumerations had their possible values passed in as an array reference, rather than the old, ambiguous list form. Since I was the one who, years ago, pushed for that change, I couldn't get upset by having to deal with it now.
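The enum change looks something like this; the type name here is made up for illustration:

```perl
use Moose::Util::TypeConstraints;

# Old, ambiguous list form, which modern Moose rejects:
#   enum 'MailboxPlan' => qw(basic plus pro);

# New form: the possible values go in an array reference.
enum 'MailboxPlan' => [qw(basic plus pro)];

# Anonymous enums likewise take just the reference.
my $plan_type = enum [qw(basic plus pro)];
```

The reference form removes the ambiguity about where the value list ends and any other arguments begin.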

All told, I spent about three work days doing the upgrade and then one doing the deploy and dealing with little bugs that we hadn't detected in testing. It was time well spent, and now we just have one last non-Plack service to convert… from HTML::Mason.

The Great Infocom Replay: Enchanter (body)

by rjbs, created 2015-09-23 23:28

I think I'm officially giving up on beating Enchanter, but it has been a pretty interesting experience as far as my big replay goes. It's not because the game is great, but because it has allowed me to get a better handle on what I don't like about the early Infocom games.

I found the writing to have the same sort of charm as other Infocom games. It's very economical, but sometimes successfully both wry and whimsical. It has a lot of the same problems as other games of the period, though. The two that gutted me: hunger and mazes. "You are getting hungry" is one of my least favorite things to see in a game. It is a guarantee that I will end up dead and have to figure out how to replay the game in fewer moves, next time. Similarly awful is the feeling of leaving "The Transparent Room" only to find myself in "The Transparent Room" with a slightly different description. It means I'm going to waste half an hour figuring out a maze using the "gather a bunch of items and drop them in different rooms" technique. It's not fun.

I managed to make a pretty good map, though, by marking all the unknown exits and then continually restoring to speed-run through each one. This, I think, is when I realized that I was not going to love Enchanter. The idea of the game is fine. I like the magic in it. I just didn't feel like it was much of a story, because I kept having to go back and do it all over again. I realized that this is my problem. I want these games to be stories, not puzzles, but they are fundamentally puzzles, with a veneer of story around them.

On the other hand, I can look at my two favorite Infocom games (so far): Suspended and A Mind Forever Voyaging. Suspended is almost pure puzzle. The story is barely there. At least, that's how it seems to me, although I'm not sure why. When I have to restart in Suspended, I don't feel like I'm breaking apart a story with a rising action and climax. It's just a puzzle box. A Mind Forever Voyaging is almost pure story. There are a few puzzles, but they're pretty simple. For the most part, the game is a world that you explore, and the puzzles are there to motivate you to do so.

When I made this realization, I really accepted the new way I was playing Enchanter: I didn't try to enjoy the story anymore, I just cataloged rooms and objects, trying to piece together the critical path for the game in my head in a big flow diagram. This suddenly seemed like the right way to play the game, and I thought I might try to finish it, since I had this new handle on it. Then again, I thought, I wasn't having much fun. Given that it has taken me two and a half years to get through nine game replays, it seemed foolish to spend longer on this one than I'd enjoy.

Next up, Infidel.

I won a NAS! (body)

by rjbs, created 2015-09-13 23:38
tagged with: @markup:md hardware journal

Last year, I bought a Synology ds214play NAS. I posted about my horrible data migration, wherein I lost a whole ton of data, entirely because I was a bonehead. Despite that pain, I absolutely loved the Synology NAS. It frequently impressed me with how much it could do, how well it did it, and how easy it was to do. Even after I moved all of my media to it, all my old backups, and started using it as a Time Machine backup destination, I had a good terabyte of space left.

Obviously, that meant I spent the whole last year thinking, "I should get a bigger NAS!" I seriously considered doing so, figuring out how much I could sell the old one for. At every turn, though, I'd remind myself that this was totally crazy. I had no need for a bigger NAS. It would cost money, take time to move to, and have no actual benefit. Maybe when I got low on space, it would make sense, but planning too far in advance for that would just mean that I'd pay too much for the drives.

On the other hand, when I saw an Engadget raffle to win a Synology ds415play, I broke my usual habit of ignoring all online contests as unwinnable scams. I figured it couldn't hurt, especially since I gave a tagged email address. I followed Synology on Twitter and visited their Facebook page to earn two extra entries. Then, I forgot about it. A few days later, though, I got an email telling me that I won. I put on my best skeptical face, but a few days later I got some legal papers from FedEx, and not too much later, I got the NAS!

It was a four-bay NAS with four 4T WD Reds. That's 12T of storage in a RAID5. What would I do with all that space? Who cares, it was free! Unfortunately, it arrived the day before I left for YAPC::Asia, so I didn't have a lot of time to get things cut over. Remembering how badly things worked out when I rushed last time, though, I decided to wait until I got back. It ended up taking me much longer than I wanted to transition, but now that I'm done, everything is great.

Problems I encountered:

The default instructions for migration involve yanking the drives from the old NAS and inserting them into the new one. If anything went wrong there, I'd be hosed, so first I wanted to make backups. I didn't have any drives large enough to store all the data, though. I couldn't just yank the drives from the new NAS, because my USB enclosures couldn't address 4T at once, and I'm sure I could've done something more complicated, but ugh. Instead, I thought, I'd just use the network. After all, the two NASes were on a gigabit ethernet link.

I did an rsync from one device to the other, using the Synology backup service. It took almost exactly twenty-four hours. This is Synology's suggestion when you're doing a network migration. Unfortunately, once I finished the rsync and tried to restore the data onto the working space of the new NAS, it told me "no backups detected." Waah? I asked Synology support, and their response was, "please email us the admin password to your NAS." I was pretty uninterested in this and asked for a second opinion. Eventually they suggested something else, sort of half-heartedly, but it took days to hear back, and by then I'd already taken action. In fact, I'd taken the action they suggested, and it worked.

With the rsync to the backup area done, I had all the files on the new NAS. I just copied things into place, did a couple chowns, migrated only the system settings, and I was done. Maybe it would have been more convenient to have had the restore work, but it couldn't have been much easier. Once I'd fixed the file owners, only one thing didn't work: Time Machine. OS X just refused to believe that the "sparse bundle" disk image was valid. I couldn't find any good explanation, so I decided that it was probably a problem with extended attributes "or something." I blew away the new copy and rsync-ed again, this time using OS X's patched rsync (and -S). It worked. Why? I'm not quite sure, but who cares, right?
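For the curious, that second copy would have been along these lines. The paths are placeholders; `-E` is the flag Apple's patched rsync uses to copy extended attributes and resource forks, which a stock rsync won't preserve:

```shell
# Re-copy the Time Machine sparse bundle using Apple's bundled rsync.
# -a archive mode, -E extended attributes/resource forks (Apple-specific
# flag), -S handle sparse files efficiently. Paths are hypothetical.
/usr/bin/rsync -aE -S "/Volumes/OldNAS/TimeMachine/" "/Volumes/NewNAS/TimeMachine/"
```

The working theory is that the first copy dropped extended attributes the sparse bundle needed, and the `-E` pass restored them.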

While getting to this point, I had a bunch of false starts and actually fully copied my data from the old to new NAS more than once. Each time, this was my fault, and nobody else's. The best reason for recopying everything was to change my RAID type. The device arrived formatted in SHR-1, which is an enhanced RAID5. I reconfigured it as SHR-2, which is more like RAID6. Sure, it cost me 4T of space, but now I can lose two drives. This is a much more useful benefit for me. If I was going to spend a year longing for a four-bay NAS, it should've been because I could have two drive redundancy, not for a lot of space that I didn't need.

I also had fun yanking and re-inserting a drive. I knew it would work, and then it took ages to recheck the data integrity, but it's just fun, and I have no regrets.

I'm looking forward to getting even more out of my Synology. It can run quite a few useful network services, and I'm only using one or two of them. A Synology really does work at providing all the "cloud" services that a typical user might want, but privately in your own house. There are a few such services that I use that I'm looking to stop using. I will post on those successes as they occur.

Engadget did not ask me to say anything about the contest or the device, nor did anybody else. I really do like the Synology NAS line. Thanks for choosing such a great winner, Engadget!

YAPC::Asia, days 3-4 (body)

by rjbs, created 2015-09-06 22:58
tagged with: @markup:md journal travel yapc

YAPC actually only runs two days, or three if you count "RejectConf" on day zero. So, this entry is not really about YAPC::Asia, but about what I did between the end of the conference and my trip home.

I woke up at the YAPC hotel, got some breakfast (fish and okayu!) and packed my bags. The plan was to visit Nikkō with Shawn, Dustin, and Karen, and my first step was to get to Kita-Senju Station, one of the busiest stations in the Tokyo Metro system. This was easy (although I was flagged for not having paid properly on my last trip, and had to pay more). The complicated part came when I had to find Shawn and Dustin, then Karen, and then get seated. Karen and I ended up in a different car from Shawn and Dustin, each of whom was seated in his own car. I was seated directly in front of Karen, but moved back to be able to speak with her. Doing this seemed to put the conductor in a very awkward situation. He apologetically reseated us at the rear of the nearly empty train, although nobody ever seemed to take the seat I'd taken, or really to sit near where we'd been at all. I was curious, but it seemed best not to ask.

Nikkō was a pretty little town. Although its population is ninety thousand, it seemed very much like a one-street town. We dropped our bags at the hotel and headed up the hill toward the temples and shrines. Partway there we stopped to get lunch, but after the party in front of us was admitted to the restaurant, the "closed" sign came out. "Sorry," said the restaurateur. We moved on and stopped into a little two-table mom-and-pop place where I had a totally okay bowl of udon.

lunch in Nikko

After lunch, we finished our trek uphill and eventually got to the site. My first order of business was to down a bottle of Pocari Sweat. Sure, it was grey, overcast, and drizzly out, but it was also hot and muggy, and I was dehydrating even though I was walking through a mist. After that, I took some time to look at the site, and it was quite impressive. We were briefly worried when we saw that one of the main buildings had been entirely surrounded by a temporary and very ugly building during restoration. Fortunately, though, it was the only one. We continued on into the complex and found dozens of beautiful buildings and paths.

Unfortunately for me, I had no idea what we were seeing. I had skipped out on the ¥500 audio program, and all the signs were in Japanese. I had it in my head that it was all Buddhist, and specifically related to Nichiren, but I later realized that I had confused Nikkō the town with Nikkō the priest. Although this was an embarrassing confusion, I was at least pleased that my memory had held on to any of this material! In fact, the site is a combination of Buddhist and Shinto structures, which is pretty reflective of Japanese religious tradition. It was sometimes clear what was what, but at other times I had no idea just what I was seeing. If I went back, I'd get the audio guide.

Nikko shrines and buildings

One thing that I didn't capture well in any of my photos is the verticality of the site. Between sets of buildings there were many long flights of stairs. The stairs were slick, wet stone without handrails. Once or twice, someone above us would slip or drop something, and there'd be a moment of panic as I wondered whether we were all about to go down like tenpins. Fortunately, it never happened. It felt like we'd made it quite a ways up, but there was more to go. Eventually, we decided we'd gone high enough, and headed back down. That was just as harrowing, but a bit less exhausting.

On the way back down the hill toward our hotel, I decided that I wanted to get ice cream, but we never saw anything that captured our interest. I was hoping for something slightly more interesting than random soft serve. In the end, I ended up getting dango, which I'd seen on the way up, and was puzzled by. I had no idea what it was, even when I ordered it. I think that what I got was [mitarashi dango](Mitarashi dango). Once I tasted it, I understood. It was three balls of something made from rice, a lot like mochi, painted with something very much like the sauce on unagi. It was quite tasty, although I think I would've preferred it without the sauce, or with something sesame-based. I'd definitely order it again, though.

Also on the way back, we saw monkeys! There was a group of maybe six monkeys, one of them carrying a baby on its belly, and they crossed the road on the electrical wires and vanished onto the nearby roofs. Woah.

Karen headed back to Tokyo, and the rest of us went to our room. We were all exhausted. There was a brief chance that we'd be social when I went looking for a beer vending machine, but it turned out the hotel vending machine only had soft drinks. I thought maybe I'd try to get just a little more work done, but bad luck struck: my laptop was busted again. Just like several weeks earlier, the keyboard and trackpad were no longer recognized at all. I couldn't log in or do anything else. I made sure I got my 3DS and iPad charged for my upcoming trip back.

I fell asleep at some absurdly early hour and woke up around 3:30, when Shawn also woke up and snuck out to do some walking around. I stayed in bed to try to get some more sleep, but had no luck. Eventually, I headed to the Tobu-Nikko station to head into Tokyo and toward Narita. I got my ticket and made it to Tokyo station without human assistance, and I felt quite proud of that! (To be fair, I got a bit muddled getting from Kita-Senju to Tokyo station, but I did okay.)

My goal was to deposit my bags at a coin locker in Tokyo station, to visit Tokyu Hands and the Pokémon store, and then to head to Narita. The problem was that while both shops gave some basic instructions on how to find them, they were both located in huge shopping centers, and those two shopping centers were connected to one another. Nearly all the signs were in Japanese. GPS was no help. All the coin lockers were taken, so I was hauling my luggage everywhere. I nearly gave up on finding anything, but eventually I accomplished my missions: gifts for Martha and Gloria… and for me. It's almost two weeks later, and Martha still takes her Ditto everywhere, though, so it was worth it.

The trip to the airport was fine. Once there, I had lunch with Marylou, who was flying out near the same time. We had tempura. I've never been a big fan of tempura, but all the tempura that I had on this trip made me think I should give it another chance. I killed a lot of time in the ANA Narita lounge, but I found it pretty underwhelming compared to the Air Canada lounge I'd visited in Toronto. Then again, I'd later find myself in a different Air Canada lounge in Toronto that was a dump compared to the Narita lounge. Each one, though, amused me with its beverage selection. In Tokyo, they had pitchers full of Pocari Sweat. In Toronto, they had cans of Fruitopia, which I thought had been lost to history fifteen years ago.

It seems like this is probably the last YAPC::Asia in Tokyo, at least at this scale, or at least for a while. That's too bad, because each one I went to was excellent, and I was always honored to be asked and delighted to accept. The up side might be that some other amazing place picks up the YAPC::Asia name. How about Manila? Hong Kong? Taipei (again)? I'll be interested to see what happens next, and whether the next YAPC::Asia is another "festival for engineers," or a more purely Perl-o-centric conference. All I hope, though, is that the attendees have a good time, learn some useful things, and leave with some good feelings for the people and maybe for the language, too. I suppose I also wouldn't mind an invitation to speak!

Visiting Japan was excellent, as it was on my previous visits. The best thing about YAPC::Asia in Tokyo being over is that the next time I go to Tokyo, it will have to be as a holiday, with my family, with nothing to do but enjoy the city. As long as I can manage to get onto Tokyo time, I think that will come quite easily.

Until then, I'll have to make do with the notebooks I brought back from Tokyu Hands, the recipe for okayu that I found on Google, and the really excellent Terakawa Ramen in Philly's Chinatown.

YAPC::Asia, day 2 (body)

by rjbs, created 2015-09-06 17:16
last modified 2015-09-06 17:18
tagged with: @markup:md journal travel yapc

I woke up early again on the 22nd, some time before dawn, and did some preparation for the conference. It was my day for speaking, and I wanted to make another pass through my slides. Eventually, it was six thirty and I decided to figure out what I could do for breakfast other than Starbucks. I went to look up "breakfast near Sunroute Ariake Hotel" and found a bunch of reviews in which people praised the hotel's breakfast. Marylou had said it was just soup and bread, and that's when I realized my error: I had trusted someone from Pittsburgh.

It turns out that the other hotel restaurant, on the second floor, had a full buffet. It was just like the one in Shinjuku, but with a better selection and worse music. I'd been sad that there was no fish on the menu in Shinjuku, but had enjoyed the all-Beatles playlist. There was an endless supply of broiled Pacific saury at Ariake, but the music was all music box versions of 1980's hits. I think I listened to a ten-minute long version of "Take My Breath Away" while eating. I ate plenty of fish, a lot more okayu, and probably too much of other things. I am a big fan of the hotel breakfast buffet!

At the conference, I got to work adding a few sections to my slides. I especially wanted to discuss how some of the Unicode changes worked with Japanese text, but I didn't get to add everything that I'd wanted. The short version of it was this: `\b` isn't very useful for Japanese text, and `\b{wb}` isn't much better.

Before I finished my last-minute editing, it was time to meet with the translators. In previous years, I'd been asked to supply my slides a week or two in advance. The YAPC volunteer translators would then, a few days later, send me a text file of Japanese subtitles, which I'd add to my slides (at the airport). I viewed this as an amazing service, and a lot of work that probably deserved more thanks than I remembered to give.

This year, though, the team went well beyond that. Instead of getting slides subtitled, they had live simultaneous translation provided by professional translators. I had a scheduled meeting, an hour long, an hour before my talk. I figured this was sort of a window of time: at some point during it, I'd be called in for a little while, and that would be that. Was I ever mistaken! I had been asked, a few weeks ahead of the conference, to provide my slides. I sent them what I usually give for my slides: a build-by-build PDF. Each addition of a bullet point was a new page, so you could easily step through the talk to see how it would be presented.

When I got to my meeting, the translators had a printout. I was aghast! It was something like four hundred pages, white on black. The translators were somewhat stunned, too. "Are you really going to do this many slides?"

Fortunately, we cleared things up pretty quickly. They'd made printouts to annotate with notes to help with the more technical details. The translators were technical, but they needed information on some details that they weren't familiar with. (This made things fun. One of them asked me, "Why would you ever want hex float literals?" and I got to explain.) They also inflated my ego a little by telling me that they'd watched a video of my talk and found me to be an interesting speaker. I wasn't quite sure what to think, though, when they elaborated that it was because I "liked to trick, and lie to, the audience." I understood, though, that they were a little worried about my constant nonsensical digressions and jokes. They were also worried by my incorrigible punning. They also were intrigued by the word "backwhack" for "backslash." It was a fun meeting.

The meeting actually took almost the full hour, and left me with just enough time to clean up the slides that I'd left in progress, drink some water, and otherwise get prepared to speak. I think the talk went well. I tried to avoid too many verbal embellishments and puns. You can decide how I did for yourself, because the talk is up on YouTube. I was really curious how the translation went, because even at my slowed-down speed, I spoke fairly quickly. So far, all I've heard were good things. It must have been hard work. Later, someone told me that each translator would only do a small amount of work each day, because it was so taxing. I bet!

After that, I was here and there, chatting, recovering, and otherwise enjoying the hallways. Bento box lunch was provided again, and this time speakers were given the choice of several fancier boxes. I picked one more or less at random, and it was tasty.

conference day 2: bento

After lunch, I saw Miki and Marty give a talk on how containers work, including "how to write your own containers in Perl." It was excellent, and was among my favorite kinds of talks. It took a topic that many people know about and then explained the underpinnings that most people don't understand. I'm a big fan of these talks, because they demystify things that programmers use all the time without really understanding. Using things you don't understand can be convenient, but it makes it really hard to act rationally when things go wrong. I think I'll probably re-watch this talk on YouTube sometime when I'm more alert than I was in Tokyo.

Marylou gave her talk on posture, which was good, but I wish I could've seen things being demonstrated from way, way in the back. I was reminded that the Japanese seem to be excellent at squatting, and that maybe I should work on that myself. After that, I caught the second half of Jonathan Worthington's talk on concurrency in Perl 6, which was quite interesting. Brad Fitzpatrick gave a talk on profiling and optimization with Go, which did a good job of showing off the Go tooling, much of which I hadn't seen before. It was impressive.

I attended the lightning talks, the same as I had the day before. I was almost totally unable to follow them, save for a few with enough English on the slides. Despite that, they were great fun to witness. The energy level was way, way higher than has become the average level at YAPC::NA, and most talks drew boisterous laughter. I wish I could've followed along more, but it was great to see everyone have such a good time. I haven't done a lightning talk in years. I better change that next year, assuming we have a YAPC::NA. Still no venue! (I'm voting for Philadelphia!)

The closing remarks came and went, mostly going over my head, and everybody left. I lagged behind, though, with a bunch of other foreign attendees, chatting and planning dinner. While we waited, the conference volunteers (numbering about sixteen thousand, as I recall) had a final meeting and did some group photos. It turned out that there were one or two staff shirts left unclaimed, and one of them was given to me! I have to say, it's almost certainly the best conference shirt I've ever gotten. I don't have any photos of me in it, at the moment, so here's one of the volunteer corps:

the YAPC::Asia volunteer staff

As for dinner, I was adamant that I wanted to eat ramen. Marcel seemed a little reluctant to commit to eating in the bay, since the food scene didn't seem to be much of a scene, but we ended up sticking together, probably because I looked like I could not possibly be gotten to do anything else. I think we had a good time, though, and I'm not sure it would've happened without Marcel, who helped figure out how to get where we were going once we were at the right GPS location. (Several times in my trip, the GPS got me to a place, where I'd realize that it was more or less just getting me to the right block, and after that I'd have a million cubic meters to search through.)

Marcel, Marylou, Casey, and I made it to the food court of Aqua City Odaiba, a big shopping mall in walking distance of the conference. It was just a bit of a hike, but not bad. It was hot, though, and I drank several bottles of Pocari Sweat on the way there (and later on the way back). We all got ramen. I think Casey and Marylou got slightly tastier ramen than Marcel and I got, but I wasn't bothered, because the ramen I got was unlike any ramen I'd had previously. I also had some beer, which was a good accompaniment. It was Asahi Super Dry, which tickled me, because I'd just read about the "Dry Wars" that erupted in the Japanese beer scene in the late 80's. Super Dry is a totally acceptable Pilsner, which I enjoyed but didn't find outstanding. The idea of a "wars" fought over any title it might hold amused me. Of course, we had our Cola Wars, so I can't claim any sort of cultural superiority on our beverage pacifism.


(Note also my 3DS in that photo. Tokyo was a non-stop Street Pass fest. Every time I checked, I had ten or so new tags! I cleared a lot of levels in plaza games, I can tell you!)

On the way back, we stopped to see the giant Gundam statue near another shopping center in the bay. It was neat, but I was more interested in making another trip to Tokyu Hands, because there was one there. Sadly, it had closed about ten minutes earlier, so I contented myself with getting some shots of the robot before heading back to the hotel.

YAPC::Asia was over! YAPC::Asia has always seemed very short to me, as a conference. As a non-speaker of Japanese, there are a lot of events that I can't really take part in, and a lot of people in the hallway whom I can't accost and chat up. I know, from being on the other side of that, that it's hard to make someone feel comfortable in, let alone part of, a group that's quite literally foreign to them. I think that the organizers and attendees of YAPC::Asia did an excellent job of it, but in the end, there is a gulf left between me and the mass of attendees, and I think that contributes to things seeming to go by so fast. It's hard to milk every minute as I might do at YAPC::NA. I hope to keep this in mind in the future, though, and to keep trying to make would-be outsiders feel welcome when I can.

I'll make one more post about my trip, covering my post-YAPC activities, in the next few days.

YAPC::Asia, day 1 (body)

by rjbs, created 2015-09-04 23:24
last modified 2015-09-06 20:38
tagged with: @markup:md journal travel yapc

I woke up really early on the 20th and did my best to kill time. I called home, I reviewed my slides and re-packed my bag. Part of my goal was to delay until breakfast was served, so I could eat something before heading to Ariake for the conference. I wanted to see whether the okayu had been relabeled, too! Around 6:20, though, I couldn't stand any more waiting, and I headed out. This way, I figured, I'd avoid rush hour.

It almost worked. Tokyo's rush hour is well-known for being crazy, and I'd be departing from Shinjuku station, one of the busiest stations in the city. I got to the station just after one train left and decided to wander around, looking for a bite to eat. I didn't have much luck, and headed back to wait for the next train, about a half hour. While I waited for the next train to Ariake, other trains came and went, and each one was busier than the last. By the time my train came, rush hour was clearly arriving. Still, I got on and even managed to get a seat pretty quickly.

I got off at Kokusai-Tenjijō, along with a huge crowd of men in identical suits. They all headed directly to Tokyo Big Sight while I headed to the hotel. I was amazed to see the conference center so close, since I'd totally missed it on my trip down before. Of course, then it had been dark and I'd been exhausted. This time, you couldn't miss it.

Tokyo Big Sight

I checked in, ran into Marylou, and headed to the conference to register and find something to eat. Marylou reported that the hotel's breakfast was only soup and bread. We ended up getting something at Starbucks, which was not great, but was food. We ran into Liz and Wendy, who got us to the actual venue. Big Sight is huge, and there was some danger that we'd just wander around aimlessly until someone took pity on us. Fortunately, that didn't happen.

Actually, YAPC::Asia had already prepared us for this, so it wouldn't have happened anyway. They not only posted step-by-step instructions on how to get to the conference area, they posted a video. This blew my mind. It walked you through getting from the main entrance to the conference area. You could pull it up on your phone and follow along and eventually you'd be there! The conference area was pretty busy. The volunteers were getting things ready, the attendees were showing up, and I was just sitting around zoning out. I wasn't too worried about getting registered first, and eventually I got my stuff and we got started.

I saw Larry's opening talk, which I'd seen before at FOSDEM and Salt Lake City. I wondered how many of the people in the room had read Lord of the Rings and the Hobbit, which got me wondering: how great would it be to give a talk based on an elaborate Soto vs. Rinzai metaphor? Well, I think it would be great, but I think I'd be one of very few people to enjoy it. Also, I'd have to relearn a lot of the things I've forgotten over the past fifteen years.

Anyway, after Larry's talk came Kelsey Hightower talking about Kubernetes, which was quite interesting and included a lot of flawless live demonstration. This was the first of many talks on containers at YAPC::Asia. I wish I'd been more awake for it! Sadly, I spent a lot of the day half awake. There was a provided lunch, at least in some quantity, and Karen and Marylou and I managed to get in just before they ran out. We had bento boxes, and they were tasty. I also got my hands on a Pepsi Refresh Shot, which was basically a five ounce can of Pepsi's version of Jolt Cola. I had a few of these during the conference. My consumption of these was definitely way, way below my Pocari Sweat intake, though.

After lunch, it was time for Matz's talk… in Japanese! Of course it was, but for some reason I'd held out hope that there would be English. Carlos, whom I'd meet later, told me I could go find the same talk in English online, but I have yet to do that. I will, I will…

Casey gave a talk on distributed teams, which reminded me that Hackpad exists! I need to figure out whether I'd find it useful the way I used to find similar things useful. My guess is that since I haven't missed those very much, I'll live without Hackpad. Still, might be fun.

After that, I tried to stay social in the common areas, and time flew by. Pretty soon, it was time for the conference dinner. There was a huge spread, never-ending beer, and a really high ceiling, so the acoustics were good for conversation. I think I ate about four pieces of ziti and drank a fair bit of beer, but I think I was too jetlagged for it to have any effect on me. (There's a thought to file away for future trips, I guess!)

At previous YAPC::Asias, I found the dinner somewhat difficult, because it wasn't easy for me to go chat up random attendees, and I was very rarely approached by anyone there. This year, for whatever reason, that didn't seem to be a problem. I spoke with quite a few attendees, both local and foreign, and had a really good time. One attendee, who told me he was mostly an Android programmer, asked me about Perl 6. "I hear it's got some backward incompatibilities."

This kind of question would be inconceivable at YAPC::NA. Everybody is there because it's a Perl conference. YAPC::Asia isn't, really. It's a technical conference with a strong Perl heritage. One way to think about it: it's a very good general tech conference for software engineers, one much more likely to have a bunch of Perl content than a conference programmed around current trends. Going to a conference with lots of people from outside your usual circle is a good idea. If they're experts in the kind of thing you're not, even better!

This reminds me of the stories I've heard about people who up and decide to go attend a conference for psychologists or architects. Maybe I should do that, someday, too.

After dinner, a group of us headed back to the hotel, intending to have another drink before bed, but the hotel bar was full of people dancing in a circle and beating drums. I took it as a sign, went upstairs, and collapsed.

YAPC::Asia, day 0 (body)

by rjbs, created 2015-09-01 08:54
tagged with: @markup:md journal travel yapc

(Where's day -1? Well, I left home on the 18th (day -2) and got to Tokyo on the 19th (day -1), but since I didn't sleep between the two, they formed one virtual day for me. Day -1 was lost, like tears in the rain.)

I woke up way, way too early on the 20th. At the latest, it was around four o'clock. I tried to lie in bed, very still, pretending to sleep, but eventually I got sick of it and got up. I would repeat this pattern every day for the rest of my trip. I did morning stuff and spent some time reviewing my plans for the next few days. Eventually, breakfast was available and I went to eat some.

I ate a lot of breakfast, especially okayu. Okayu is the Japanese version of congee, rice porridge. It's what you give people who are sick or, apparently, heavily jetlagged. I ate a lot of it while in Japan, usually with a big helping of kelp. It was probably the best new breakfast food I've had in a long time. The only problem I had with it was the labeling. The okayu was in a big serving vessel labeled "gruel." I sent a polite email to the hotel explaining that only prisoners and Victorian-era orphans eat gruel, and to my great delight, they replied that they would change the label immediately.

My plan for the day was to spend a day out on the town, doing stuff and keeping busy in an attempt to adjust my body clock. My body clock did not get adjusted, but I had a good time, anyway. I met up with Marty and Karen for lunch at Joël Robuchon's L'Atelier, where I'd failed to get lunch on my last trip to Tokyo. It was excellent. Marty cautioned me that theirs was not the best foie gras available in Tokyo, but I ordered it anyway, mostly because I am a risotto lover. I had a great meal!

At this point, I was absolutely stuffed. I'd had a big breakfast followed by a three course lunch. Karen said, "Well, I was going to suggest ice cream, but…" and I said, "let's do it!" We went to Snow Picnic, a liquid-nitrogen-using ice cream joint in Nakano. Karen and I had tried to go to National Geographic Travel's "second best place in the world to get ice cream" in 2013, but found it closed for good, so she'd looked up where to go instead on this trip. We were both excited for it… but then we found that Snow Picnic was closed for vacation! Instead, we went to Daily Chikyo, a funny little soft serve place in the basement of a shopping arcade. It was good, especially because it was so hot and humid.

Speaking of the heat and humidity, I should mention the vending machines. Everybody jokes about the many weird vending machines in Tokyo, but what I don't think people realize is how many there are. You can't walk more than a block without seeing one (and probably more) in most places. Some of them have soda, and some have water, but they all have tea, weird fruit drinks, and some kind of sports drink. The sports drinks (labeled "ion drinks") are a lot like Gatorade, but seem to come only in one flavor ("white") and have almost no sugar. I drank enormous amounts of Pocari Sweat, the most common sports drink around. It was so hot and gross that I could feel myself dehydrating all the time, and the vending machines of Japan were my constant ally in the fight against heat stroke.

We looked at a lot of little shops at Nakano Broadway, where Karen found some post cards and I wisely decided against buying a ¥18,000 Batman statue. We went to Book Off, a huge bookstore. Finally, we ended up at Tokyu Hands. I've heard a lot of people tell me how I should really go to Akihabara on my trips to Japan. It's a huge center for electronics and other of-interest-to-geeks stuff. I've gone, and it's cool. I think in general I find Tokyu Hands much more fun, though. It's like a giant combination of A.C. Moore and Staples, plus a bunch of other random stuff. I treated myself to some nice notebooks.

paper-oh quadro outside

Around five, Karen had to get back home. I got myself back to the hotel, did a little reading, and called home. Through an extreme effort of will I managed to stay up until eight o'clock, but that was that, and I fell asleep. I realized, that night, that the reason I'd managed to get onto Tokyo time in the past was that I was staying with Marty and Karen, and would stay up late every night chatting. If I find myself headed back to Tokyo again, sometime, I'll have to make sure I've got evening plans at some kind of venue where it would be rude to fall asleep.

It was good to get to sleep, anyway, and the next morning I'd be up early (too early) to get down to Ariake for YAPC!

YAPC::Asia, day -2 (body)

by rjbs, created 2015-08-31 23:26
last modified 2015-09-06 20:38
tagged with: @markup:md journal travel yapc

YAPC::Asia starts on August 20th with "day zero," a set of talks that didn't make the main two days. I probably won't be there for much of that, since it's mostly Japanese content. Despite two prior YAPCs in Tokyo, I still can't understand Japanese. Go figure!

It's Tuesday the 18th, for me, and I'm headed to Tokyo, where I'll arrive on the 19th. Time zones, man.

I'll write up the conference after it's over, but I thought I'd kill some time on the plane by writing up the trip so far. It has been some good travel!

Last night we watched two episodes of Scream (so far: it's okay). Got to bed around eleven, with my alarm set for 4:45. As usual, because I knew I had to get up early, I slept terribly, waking up again and again, wondering whether I had overslept. (This was silly, Gloria would have made sure I got up, but it wasn't under my control!) I got a quick shower, did my last minute packing, and we were out the door. We stopped at Wawa and I got a breakfast sandwich on cinnamon French toast. American breakfast is a fine thing.

The bus ride was uneventful. I played some Animal Crossing. I finally had to get a haircut. I had to take the 5:20 bus to get to Newark on time, but that got me there almost three hours early. Security for Terminal A was the longest I have ever seen. That killed some time. I dozed a little. I played a little Minecraft with the kid. Mostly, though, for two hours, I walked back and forth through the food court, scowling at the ridiculous airport prices. Seven dollars for a soft pretzel! I showed them, though. I didn't buy a thing! That'll teach'm.

The flight to Toronto was short and uninteresting. They offered a free snack: two lemon crackers. I'd have scowled at that, too, but they were pretty good.

At YYZ, I had to go through customs, even though I wasn't planning to leave the airport. Nothing surprising there, but after customs, I had to walk down a long corridor, passing above the food court. I began to think I'd have to wait to eat on my flight… but then I saw the Air Canada lounge. I'm gonna go ahead and say it: the Maple Leaf Lounge in YYZ terminal E is my new favorite lounge. I had some pho and two gin and tonics (and some other stuff). It was big and roomy and clean and modern and I would not mind getting stuck there for a couple hours sometime.

(Dear Fate: this is not an invitation.)

When I went to board, the gate agent took away my boarding pass and vanished for a minute. When she came back, she handed me a new crisp boarding pass. "Complimentary upgrade today, sir." I can't remember the last time I flew business class, but it's been over ten years, and it wasn't as nice as this seat. It's a recliner, it's comfortable, and it has the best adjustable foot rest I've ever seen. The seat next to me is empty.

Then! Then! I poked around the in-flight entertainment system — which, by the way, is a heck of a lot better than that crap they give you in coach — and there was a documentary about A Tribe Called Quest. I watched that and then I ate some steak.

Hopefully on the rest of the flight I'll get some work done and get some sleep. I've got this fantasy idea that I'll adjust in-flight to Tokyo time. Probably not, but trying will give me something to do. I think that ATCQ documentary was the only movie on the list that I'd want to watch without Gloria.

(At this point, I closed my laptop and I never got back to writing any more in this entry until August 31st. This should be read as a good review of YAPC and my time in Tokyo. Of course, it didn't help that my laptop died on my last night there, and I just got it back today.)

Shortly after arrival in Narita, I met up with Casey West, his girlfriend Manda, and his former co-worker Marylou. We'd originally planned to make most of the trip into the city together, but when the dust settled, Casey and Manda were gone. I promised to help Marylou find her hotel, but this required first getting to my hotel where my wifi hotspot was waiting for me. This wasn't too difficult, but my hotel turned out to be more of a walk from Shinjuku station than I'd expected, and we were both pooped. Only through the strange reserves of strength that show up after 36 hours of wakefulness (plus a plate of sushi) did I stay awake long enough to get her to Ariake and then me back to Shinjuku.

I stayed at the Shinjuku Granbell, and my room was absolutely tiny. I was surprised by how much I enjoyed that. The bed was comfortable (and very, very firm). The water pressure was high. The smallness seemed to say, "Get out of here and do something else." This photo does not do justice to its size.

my tiny hotel room

I got back to the hotel pretty late and had a nice (but short) FaceTime call home. I was pleased, too, to be getting to bed so late. Surely, I thought, I'd be able to sleep until a reasonable hour the next day. Well…

trust no one (body)

by rjbs, created 2015-08-14 18:06
tagged with: @markup:md journal security

At work, we recently moved from our own office space to a coworking space. Bryan said, "remember to lock your laptop screen when you're not using it." I said, "I use Mobile Mouse, so I can lock it with a hot corner from across the room."

He asked, "How does Mobile Mouse connect?"

The importance of the question was obvious. I knew it was wi-fi, and the wi-fi is shared with the rest of the coworkers. Surely anything that can remote control my computer will be a secure connection, right? Right? The docs said nothing, so I fired up a packet sniffer.

$ sudo tcpdump -i en0  -w mouse.packet port 9090
[ connect with Mobile Mouse, mouse around a little ]
$ strings mouse.packet

What did I find? Here's a sample:

[ a bunch of base64-looking stuff; I think it's the Dock icon images ]

There's my phone's device name, the password, my laptop's name, and a bunch of other identifying information. Anybody who sniffed the network for a while could find this traffic and then remote-control my laptop when I looked away. (Or, more amusingly, when I wasn't looking away.)

I asked the makers of Mobile Mouse why they didn't use a secure connection, and whether they would. They said, "Well, it's really intended for a secure local network, but we'll think about adding this feature." Still, they link to people who review this device as a presentation remote. This sounds like a recipe for at least hilarity, if not disaster. "Hey, the consultant is presenting with his phone on the guest wi-fi. Let's sniff it!"

My point here is not that Mobile Mouse is bad software. It's really good software with this one enormous flaw. My point is that nobody really cares about protecting you except for, hopefully, you. You had better pay attention!

no wrong way to play (body)

by rjbs, created 2015-05-24 22:03
last modified 2015-05-25 19:03
tagged with: @markup:md dnd journal

I am always baffled by the never-ending stream of remarks of the form, "you people are playing D&D wrong." Here's one that particularly bugged me today:

What SlyFlourish should be saying, here, is "hurts my ability to bring in and keep new players who care about the things I care about." Some players like playing in a very fatal environment. People play all kinds of games "on hard" on purpose, even games they haven't mastered on easy. And anyway, having a lot of character death doesn't make D&D harder, it just makes it different, because you don't win in D&D. And anyway, if you want to have victory conditions in D&D, that's cool, too.

It makes me crazy to think that people are being told, "you can't bring in or maintain new players if you let beginner characters die often." There are tons of games that work this way, and succeed in growing. I know: I have run some of them. Obviously, you have to know what your players expect, and what will make them unhappy. Part of this is asking, and part of this is establishing expectations up front. I make it clear that characters in my games die a lot, and that this is not about player failure, but about the kind of game I run. We still have fun. I have also played in games where the game master has gone out of their way to prevent character death when it seemed really justified, because they felt it would make the player unhappy. I still had fun.

I'm not a big fan of D&D 3E, but I thought its Dungeon Master's Guide II was great because it talked about how to establish and maintain a game based on what the players want. That's how you make a game succeed, after all: you figure out what you all think will be fun, try it, and then iterate on that. That's why "this is bad for players" makes me crazy. It's bad for some players. Or "I don't know how to do this in a way that players will like," which is a totally okay thing to be true. There's plenty of stuff I can't do, even though players might like it, and so I avoid it, because it would be bad.

The whole thing reminds me of [an episode of Parks and Recreation]. Ron "Mustache Guy" Swanson and Leslie "Amy Poehler" Knope have competing scouting groups. Leslie's group focuses on singing songs, baking cookies, and pillow fights. Ron's group struggles to build shelter and find something to eat. He tells his scouts, "We have one activity planned: not getting killed." By the end of the episode, all the scouts in Ron's group have defected to Leslie's, because they don't think Ron's group is fun. Leslie wins, Ron loses.

This is where a lesser show would end, but Parks and Rec is better than that. Leslie takes out an ad in the paper, calling for the kinds of kids who would like Ron's kind of camp.

Are you tough as nails? Would you rather sleep on a bed of pine needles than a mattress? Do you find video games pointless and shopping malls stupid? Do you march to the beat of your own drummer? Did you make the drum yourself? If so, you just might have what it takes to be a Swanson. Pawnee's most hardcore outdoor club starts today. Boys and girls welcome.

Then, some kids show up and they are excited to become Swansons. There is more than one way to be a scout.

So, here is my advice: ask your new or potential players what they want, or tell them what to expect. Or do both. Don't give up on what you like just because someone told you it was a niche style or that you'd be unable to retain players.

perl has built-in temp files (body)

by rjbs, created 2015-05-22 11:36
last modified 2015-05-23 19:31

I use temporary files pretty often. There are a bunch of ways to do this, and File::Temp is probably the most popular. It's pretty good, but also pretty complicated. A big part of this complication is that it's meant to keep your filename around until you're done with it, and to let you pick its name and location. Often, though, I don't need these features. I just need a place to stream a whole bunch of data that I'll seek around in later, or maybe just stream back out. In other words, instead of holding a whole lot of data in memory, put it in a file.

See, if you're going to put data in a file, then close it, then ask some other program to operate on it, it almost certainly needs a name. You might open that program and pipe data into it, but it's often much easier to just give it the name of a file on disk. If you don't need that, though, the filename is totally extraneous. In fact, it just gets in the way by making it possible to leak disk usage. A filename is a reference to storage in use, just like an open filehandle is. Just like you can leak RAM by leaving a reference to a variable in global scope, you can leak storage by leaving a name on the filesystem. That RAM will come back when your program dies, but the storage will wait until you erase the filesystem!

On most platforms, you can't create a truly anonymous filehandle, but you can do the next best thing: you can create a named file on disk, hang on to the filehandle, and immediately unlink the name. When your program terminates, there will no longer be any reference to the data on disk, and it can be freed.

Perl even makes this easy to do:

open my $fh, '+>', undef or die "can't create anonymous storage: $!";

This creates a file in your temporary directory (either $TMPDIR or /tmp or your current directory) with a name like "PerlIO_TQ50Oh" and then immediately unlinks it. The magic comes from the use of an undefined value as the filename. That mode, +>, is nothing special. It just means "create the file, clobbering anything that's in the way, and open it read-write." Now you can write to it, seek backward, and then read from it. This feature has been there since 5.8.0! If you can't use it because of your perl version, you have my sympathy!

Of course, maybe I'm weird in ever being able to make do with temporary files like these. I don't think so, though. When I asked on IRC recently whether I was missing some reason that it wasn't more common, almost every single response was, "Woah, I never heard of that feature."

Now you have!
