LaTeX Preview with Vim and Evince

During a conversation (and a game of Scrabble) at the Google Summer of Code mentor summit, it came up that a few of us were LaTeX users, and we talked briefly about how it would be nice if there were some way to get a real-time preview of the final document while editing in Vim. Here's my solution.

I use make to build PDFs of my LaTeX files. A typical makefile looks like this:

LATEX= latex
DVIPS= dvips -j0 -Ppdf -u -G0 -t letter -D 1200 -Z -mode ljfzzz
PS2PDF= ps2pdf -dEmbedAllFonts=true -dSubsetFonts=true

NAME= foo
FIGURES= images/*.eps

all: $(NAME).pdf

$(NAME).pdf: $(NAME).ps
    $(PS2PDF) $(NAME).ps $(NAME).pdf

$(NAME).ps: $(NAME).dvi
    $(DVIPS) -o $(NAME).ps $(NAME).dvi

$(NAME).dvi: $(NAME).tex $(FIGURES)
    $(LATEX) $(NAME).tex; $(LATEX) $(NAME).tex

clean:
    rm -f *.dvi *.ps *.pdf *.aux *.log *.lof *.lot *.toc

Knowing that Evince updates its view automagically when a file changes, I just added a post-write hook to my ~/.vimrc to run make:

autocmd BufWritePost,FileWritePost *.tex !make

Now, whenever I write the file out, my Evince window updates with the latest output.
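One refinement (a variant I find handy, not part of the original setup): running make silently avoids the hit-enter prompt on every save, at the cost of needing a redraw afterward:

```vim
autocmd BufWritePost,FileWritePost *.tex silent !make > /dev/null 2>&1
autocmd BufWritePost,FileWritePost *.tex redraw!
```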

I haven't yet checked out Latexmk, which can supposedly achieve similar results and save me the trouble of maintaining a makefile to boot.

Loggerhead Init Script for Gentoo

I just set up a Bazaar repository server at work. Gentoo has no official ebuild for Loggerhead, so I installed it from Mark Lee's Bazaar overlay. Unfortunately, this does not ship with an init script for serve-branches, so I wrote one.

The script is /etc/init.d/loggerhead (mode 755):

#!/sbin/runscript
# Copyright 1999-2009 Gentoo Foundation
# Distributed under the terms of the GNU General Public License v2
# $Header: $

PIDFILE="/var/run/loggerhead.pid"

depend() {
    need net
}

start() {
    ebegin "Starting loggerhead"
    start-stop-daemon --start --quiet --background \
        --make-pidfile --pidfile ${PIDFILE} \
        --exec /usr/bin/serve-branches -- --log-folder=${LOGDIR}
    eend $?
}

stop() {
    ebegin "Stopping loggerhead"
    start-stop-daemon --stop --quiet \
        --pidfile ${PIDFILE}
    eend $?
}

This uses a single entry from /etc/conf.d/loggerhead:


It seems to work. When I get the chance I may patch the ebuild to include it and suggest it to the maintainer.

On Path and Edge Strengths in Fuzzy Graphs

How does one define the strength of a path in a fuzzy graph? Mathew and Sunitha state that "the degree of membership of a weakest [edge] is defined as [a path's] strength" without any apparent justification. Saha and Udupa mention that several measures, including sum, product, and minimum, all seem plausible, but also ultimately choose the minimum for their purposes (based on a set of axioms related to their problem domain).

However, many methods of combinatorial optimization which operate on weighted graphs assume that the weight of a path is simply the sum of the weights of the edges which comprise it. In order for me to use such methods unmodified, I must define edge weight in terms of edge strength, and define path weight using the sum. I cannot, therefore, define path strength directly, and certain definitions (including the apparently popular minimum-edge one) are impossible to achieve this way.

Naturally, since the weight represents a cost of some sort and smaller weights are more desirable, edge weight should be inversely proportional to edge strength μ. In FuzzPy, I simply take the inverse of μ, which I now realize is naïve. In fact, a more correct basic definition would be 1/μ - 1, so that an edge with μ = 1 would result in a weight of 0.

A model of pairwise camera field of view overlap in a multi-camera network (fuzzy vision graph) is built of generally intransitive binary relations: if camera A overlaps with camera B, and camera B overlaps with camera C, camera A does not necessarily overlap with camera C, and almost certainly not to the degree implied by a transitive fuzzy relation. This intransitivity is the final nail in the coffin for non-sum-based path strength definitions in my problem domain.

Returning to the definition of edge weight, now in the context of multi-camera networks, there is something missing. Forgive me a contrived example from a problem such as calibration optimization. Intuitively, a path of eight edges each with μ = 1 is likely to be less optimal than a path of a single edge with μ = 0.9, because the definition of μ does not fully encapsulate the pairwise calibration error (not least because the μ values are normalized). Should edge weight be defined as 1/μ - 1 + α, where α is some fixed accuracy (or other) cost for a single hop? My aforementioned naïve version (effectively, with α = 1) has been working reasonably well in experiments. However, this extra parameter is not intrinsic to the fuzzy vision graph model, and thus must be defined by the optimization application.
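As a rough illustration of why the per-hop cost matters (a sketch with made-up values, not code from FuzzPy):

```python
def edge_weight(mu, alpha=1.0):
    """Map edge strength mu in (0, 1] to a cost, plus a fixed per-hop cost alpha."""
    return 1.0 / mu - 1.0 + alpha

# Path weight is the sum of its edge weights, so standard shortest-path
# methods apply unmodified.
long_path = [1.0] * 8   # eight edges, each with mu = 1
short_path = [0.9]      # a single edge with mu = 0.9

w_long = sum(edge_weight(mu) for mu in long_path)    # 8.0
w_short = sum(edge_weight(mu) for mu in short_path)  # ~1.111

assert w_short < w_long  # with alpha = 1, the single hop wins
```

Without the α term, the eight perfect edges would sum to a weight of zero and always beat the single μ = 0.9 edge, which contradicts the intuition above.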

The rationale behind the fuzzy vision graph is that it is a "quick and dirty" model of a camera network's visual topology, and I think defining additional application-specific things like this α parameter at the optimization stage is appropriate.

Thoughts in Kaoss

Yesterday, I spent the afternoon implementing the KMod hack on my Kaossilator thanks to a most excellent set of instructions. I used a DE-9 male connector, so I actually have 2 pins free for potential future additions (I wonder if I can cram something in there to output a MIDI-compatible tempo clock signal based on whatever is driving the little dot on the display).

I have plans for two things to plug into it now.

The first will be a simple little stub that just sits in the connector, with a 47K resistor and a switch in series between the SUO and SUI pins. I'll make this as small as possible, and leave it attached most of the time for a portable sustain switch.

The second is my ambitious external control box. I'll start with a basic set of buttons and toggle switches that control all the on-board functionality, including single buttons/switches for combos like key, loop length, and erase.

Next, I'm going to look into the feasibility of adding a port and circuit to allow some kind of synchronization from an external MIDI clock signal (probably not a huge deal, but I don't yet know anything about MIDI really). This will be useful for syncing up with my x0xb0x, whenever I finish building it.

Finally, and most interestingly, I want to control functionality via my USB Boarduino. I'm going to develop a GUI application in Python (working title Kaosslab) that lets me do some really cool stuff. One tantalizing example is tempo-synchronized recording of loops (by having Kaosslab activate the loop button and record for the appropriate amount of time based on user-supplied tempo).

I also recently put up Kaossilator Fu on my web site. I'm working on populating it with every Kaossilator tip, trick, and hack I can find, as well as some videos of phat Kaossilator jams (like this one and this one and this one).

Rollerblade Odometer

I want a pair of rollerblades that, using simple technology (no GPS), can fairly accurately report to me how far I've traveled. I want an odometer readout that I can reset before each trip. How can this be accomplished?

Add rotary encoders to the front and back wheels on the skate. We want the frictionless optical tachometer type; the direction of motion is irrelevant (picture someone skating backwards, for example). We can obtain a fairly accurate measure of the distance traveled by the skate by always recording the angular displacement of the faster-turning wheel. This can be accomplished by having each encoder increment its own small counter (say, a 4-bit one). When one of the counters overflows, it sends a signal to a large main counter and clears both small counters. The large counter is enabled by a pressure switch (maybe a piezoelectric sensor with a threshold for binary output) able to determine that the skate is in fact contacting the ground. At this point, assuming we've been intelligent about the encoder pitch relative to the wheel size, and done the appropriate trickery in the counting logic, we have a skate that can accurately measure its own ground distance traveled in some useful unit.
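The small-counter/overflow scheme can be sketched in software (a toy model with hypothetical names, standing in for the hardware counting logic):

```python
class SkateCounter:
    """Toy model of one skate's counting logic: two 4-bit wheel counters
    feeding a main counter on overflow."""

    SMALL_LIMIT = 1 << 4  # 4-bit counters overflow at 16

    def __init__(self):
        self.front = 0
        self.back = 0
        self.main = 0

    def tick(self, wheel):
        """Record one encoder pulse from the 'front' or 'back' wheel."""
        if wheel == "front":
            self.front += 1
        else:
            self.back += 1
        # When either small counter overflows, bump the main counter and
        # clear both -- effectively counting only the faster-turning wheel.
        if self.front >= self.SMALL_LIMIT or self.back >= self.SMALL_LIMIT:
            self.main += 1
            self.front = 0
            self.back = 0
```

Note that the slower wheel's partial count is discarded at each overflow, which is exactly how tracking the faster-turning wheel approximates true ground distance.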

The major issue now is that both skates are sometimes, but not always, contacting the ground, and there is no way to know a posteriori, when observing the results, how much overlap to subtract. I don't want to introduce any concept of time into this design, so what we need is some way for one of the skates to know whether the other is contacting the ground. We can accomplish this by having the pressure switch on the left skate enable an RF transmitter, with a simple structured signal that a receiver on the right skate can robustly identify. The receiver can then disable the counter on the right skate while the left skate is transmitting. Now, totalling the large counters in both skates should yield a fairly accurate total ground distance traveled.

However, as a human being, I can't tell what is in those counters, and even if I could, I wouldn't want to have to add them up in my head. We can add a button to the left skate that triggers a different RF transmission to the right which encodes its counter value and then resets it. The receipt of this signal on the right skate prompts it to add the value to its counter register and show the value on an LED display for a few seconds. This can be done repeatedly, since the left skate is just dumping its current counter value into the right skate each time, where the total is retained. The corresponding button on the right skate would simply reset its counter.

Some issues for further consideration:

  • Since the transmissions are one-way, the signal needs to be robust. We also assume that the rollerblades are both powered and in proximity to one another.
  • Power management. Should the skates power off after a period, and power on via the pressure switch? We likely want the main counter registers to be non-volatile.
  • Pressing a button on the left skate to activate a display on the right skate seems somewhat awkward from a user perspective.
I really ought to be doing more important things...

Penguicon The Seventh

Like Xavier, I came back from Penguicon 7.0 this weekend to a mountain of work. Now I'm going to walk and talk like him (minus the epic growl) -- guess that means I look like someone else here -- and give my review.

The highlight was certainly the party. Friday night was phenomenal, Saturday night even more so. Where else can you be waylaid by a pirate ship at the top of the hotel lobby stairs, and told to drink rum and walk the plank to join the crew? Best Penguicon yet, on this basis. You have to be there to know.

The panels were good this year too, as usual. We could have attended a few more if it hadn't been for a certain WTF line. Here are my thoughts on the ones I did catch:

Sustainable Computing (Jon "maddog" Hall)

A great forward-looking keynote by maddog. He deftly connected the idea of scalable distributed mesh networking for cities with providing free Internet access to kids (a la OLPC, but with fewer technical challenges), with benefits to everyone else too, and with environmental sustainability. And, naturally, he gave some highly compelling arguments as to why the sensible thing to do is use free software to implement it. A

Wil Wheaton Reading (Wil Wheaton)

Ensign Crusher, report to Penguicon. Ensign Crusher? Ensign Crusher, respond! F

Open Hardware Overview (W. Craig Trader)

A brief introduction to "open source" hardware. A big chunk of the talk was devoted to a few examples, which was surely a yawn-fest for anyone who reads hardware hacking feeds. The more interesting parts of the talk were the breakdown of the board prototyping process and the explanation of how projects apply licenses like Creative Commons to hardware design. A bit pedestrian, but not bad. B

Beginning Pygame Programming (Craig Maloney)

To be fair, I was very much looking forward to this one, so it had a lot to live up to. The talk consisted entirely of showing various stages of development of a Pong game demo. Much time was spent figuring out which revisions would actually run (blind commits are evil). A reasonable amount of Pygame functionality was used, but there could have been more explanation. Good concept, but tighter implementation necessary. C

Open Hardware with Arduino (W. Craig Trader)

This talk really made me wonder why Trader spent so much time talking about the Arduino platform in his previous talk. My main complaint is that he didn't really contrast the advantages of the Arduino against other microcontrollers and evaluation boards, which probably left most people with a somewhat distorted perception. More original content, such as some clever uses and maybe a non-trivial demo, would have been nice as well. There was lots of good information about existing projects and add-on devices. I'll give it a pass because it got the others interested. C

Rule-Based Programming in Interactive Fiction (Andrew Plotkin)

As an engineering grad student, I'm quite used to dry technical seminars, but I want Penguicon to entertain me more. That aside, awesome talk! I hadn't really thought about how awkward it must be to program IF in an object-oriented programming language until this talk. Very interesting concept about how to attack the problem with a rule-based syntax model. Some of it brought to mind aspect-oriented programming. Bonus: Andrew really likes to talk about heads exploding. B

Looking forward to Penguicon 8.0! I'm hoping to get my Thousand Parsec talk in this time.

How Random

A Python function that returns a random subset of size n from set s.

from random import randint

def random_subset(s, n):
    """Return a random subset of size n drawn from set s."""
    if len(s) < n:
        raise ValueError("Subset larger than input set")
    l = list(s)
    r = set()
    for i in range(n):
        r.add(l.pop(randint(0, len(l) - 1)))
    return r

I'm dealing with fairly small sets, so this may not be the most computationally efficient way to do it.
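For what it's worth, the standard library can do this directly: `random.sample` draws without replacement (converting the set to a list first, since newer Python versions no longer accept sets as the population):

```python
from random import sample

def random_subset(s, n):
    # sample() itself raises ValueError when n exceeds the population size
    return set(sample(list(s), n))
```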

Das Komputermaschine Ist Fur Der Gefingerpoken

A good friend of mine recently tossed me some computer parts, including an HP illuminated multimedia USB keyboard (model SK-2565, part no. 5185-2027). Since I had been looking to replace my old keyboard (a $10 PS/2 job that I turned into a k-rad all-black cowboy deck with blank keys), and had been suffering from an inability to control my PCM volume or music from the keyboard without launching alsamixer or mocp respectively, a particularly acute problem when playing StarCraft, I found herein an opportunity.

HP SK-2565 USB Keyboard

This keyboard has nineteen buttons and one knob across the top. In order, they are (or look like) sleep, help, HP, printer, camera, shopping, sports, finance, web (connect), search, chat, e-mail, the five standard audio buttons (stop, previous, play/pause, next, load), a volume knob, mute, and music. Since the keyboard was furry enough to qualify as a mammal upon receipt, the first thing I did was clean it, a process which spanned several hours (though it was niced down somewhat). The previous two sentences are related: the top buttons also happen to be built in such a way as to require utterly complete disassembly of the keyboard to remove and replace, and I am ashamed but not at all surprised to say I got the replacing part wrong. The play/pause button is now swapped with the previous button. And I am totally not taking this thing apart again any time soon.

But it is for the best! After figuring out sometime later that I had goofed, I decided (Daniel Gilbert, this one's for you) that I liked it better this way anyway. Which is perfectly fine, of course, since I'm about to get to the good part: how I made my HP illuminated multimedia USB keyboard special upper buttons work in Linux, using Xmodmap, and in awesome, using rc.lua.

Turns out it's extremely easy to bind arbitrary keycodes to keysyms (a full list of which can be found in /usr/share/X11/XKeysymDB), at least using GDM. By default (on Gentoo), GDM loads /etc/X11/Xmodmap, as specified by the sysmodmap setting in /etc/X11/gdm/Init/Default. Mine now looks like this:

keycode 223 = XF86Sleep
keycode 197 = XF86Shop
keycode 196 = XF86LightBulb
keycode 195 = XF86Finance
keycode 194 = XF86WWW
keycode 229 = XF86Search
keycode 121 = XF86Community
keycode 120 = XF86Mail
keycode 144 = XF86AudioPlay
keycode 164 = XF86AudioStop
keycode 160 = XF86AudioMute
keycode 162 = XF86AudioPrev
keycode 153 = XF86AudioNext
keycode 176 = XF86AudioRaiseVolume
keycode 174 = XF86AudioLowerVolume
keycode 118 = XF86Music

And now, the answers to all your questions:

  1. I figured the keycodes out by running xev and banging on the buttons.

  2. XF86LightBulb is the closest thing I could find to "sports" that wasn't already taken.

  3. The volume knob "clicks" and sends a keycode 176 or 174 depending on the turn direction.

  4. I did not map help, HP, printer, or camera because they do not appear to generate keycodes.

  5. I did not map audio load because I forgot. I will do it when I can think of an action to bind it to.

The next step was to make these keys actually do something in my window manager. Bindings are pretty easy to make in /etc/xdg/awesome/rc.lua. Without getting into too much detail, I bound keys to things. I am particularly impressed with how I can control audio via amixer, and my MOC playlist via commands without even having the interface open. Another bonus is the sleep button running xlock. Here's a sample line:

key({ }, "XF86LightBulb", function () awful.util.spawn("starcraft") end),
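For illustration, the audio bindings might look something like this (my actual lines differ; the mixer channel and command flags shown here are plausible defaults, not copied from my config):

```lua
key({ }, "XF86AudioRaiseVolume", function () awful.util.spawn("amixer set PCM 5%+") end),
key({ }, "XF86AudioLowerVolume", function () awful.util.spawn("amixer set PCM 5%-") end),
key({ }, "XF86AudioPlay",        function () awful.util.spawn("mocp --toggle-pause") end),
key({ }, "XF86AudioNext",        function () awful.util.spawn("mocp --next") end),
```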

A particularly nice one is the search button, which runs the following script (be nice, my bash-fu is rusty):

#!/bin/bash
Q=`zenity --entry --width 600 --title="Google Search" --text="Google search query:"`
if [[ "$Q" != "" ]]; then
    EQ=`echo $Q | sed s/\ /\%20/g`
    # Browser launcher is a guess; the original script was truncated here.
    xdg-open "http://www.google.com/search?q=$EQ"
fi


Free Software in Vision

My research area at school is distributed smart cameras, a field which is primarily rooted in computer vision. Despite having access to a range of expensive proprietary software libraries by virtue of having purchased the equipment, most of my computer vision work uses a stack of free software running on Gentoo Linux.

For interfacing to the cameras themselves, we have the excellent libdc1394, a high-level API for interfacing with IEEE 1394 cameras supporting the IIDC specification (which our Prosilica EC1350s, among hundreds of others, do). The Coriander GUI makes configuration and control a snap. The ebuilds available in Portage have so far worked flawlessly for me.

Many computer vision tasks are covered by OpenCV, a former Intel project that is gaining a lot of momentum with academic open source developers worldwide. When I first considered it in early 2006, it had a long way to go in terms of maturity. However, after seeing Gary Bradski's talk at ICDSC 2008, I decided to give it another look, and was pleasantly surprised to find out that O'Reilly had just published Learning OpenCV (co-authored by Bradski), and that it was an excellent practical introduction to the library. The latest stable release for Linux at the time of writing, 1.1pre1, shows signs that this library is becoming quite robust. It seems to finally be moving from a simple collection of algorithms toward a fully functional general-purpose computer vision library. The feature list for the June 2009 release has me excited, particularly because of the better Python interface and some big improvements in feature detection and 3D stuff.

Computer vision and related algorithms tend to use a lot of linear algebra, and depending on whether I'm coding in C or in Python, I use the GNU Scientific Library or NumPy, respectively. Both are excellent numerical libraries. I used NumPy fairly extensively in developing PyDSSCC for my Master's thesis.

My personal Gentoo overlay has ebuilds for both OpenCV (which tends to lag the release version in the official tree) and Gandalf (which is not in the official tree).