Category Archives: software

Where To Find Info When Packages Break in Debian Testing

The chromium package in Debian testing broke a few days ago. After I ran “apt-get update” and “apt-get upgrade”, chromium disappeared from my Xfce menu, and the executable was gone from my system. Nothing like that has ever happened to me before. Odd!

When I tried to re-install it by running “apt-get install chromium”, I got the following error:

The following packages have unmet dependencies:
chromium : Depends: libudev0 (>= 146) but it is not installable
E: Unable to correct problems, you have held broken packages.

Indeed, there is no package called libudev0 (there is, however, a libudev1, which I already had installed). Mysterious.

Being fairly new to Debian testing, I was at a loss as to what to do. After some googling, I discovered some information that’s useful to users trying to troubleshoot broken packages.

I already knew that Debian has a searchable package database on their website. If you search for ‘chromium’ in the testing distribution, you’ll get to a page for it.

What I’d never noticed before were the links on the right-hand side. Every package apparently has its own mailing list archive and QA page.

The QA page isn’t the easiest thing in the world to make sense of. I couldn’t find a simple listing of bugs in reverse chronological order, which would have let me quickly see the newest bugs filed. The closest thing is the list of all open bugs. There is also a dashboard page that is vaguely reverse chronological, though it may be sorted by priority; it’s not clear.

In any event, this was good enough. I could see the bug for the error message I was getting: it turns out the package had mistakenly been built against stable, which is why the unmet dependency was coming up in testing.

It’s yet to be fixed, but at least now I know exactly what the problem is.

What Django Can (and Can’t) Do for You

I’m joining a team at work for the next few weeks to hammer out what will be my second significant Django project.

I’m not an expert on Django, but I have enough experience with it now to say that it facilitates the vast majority of web application programming tasks with very little pain. It’s highly opinionated and very complex, and has all the issues that come with that, but if you learn its philosophy, it serves you extremely well. And in cases where it doesn’t—say, with database queries that can’t be written easily with the Django ORM—you can selectively bypass parts of the framework and just do it yourself.

So I’ve been puzzled by complaints I’ve been hearing about how difficult it is to work with Django. There’s an initial learning curve, sure, but I didn’t think it was THAT bad. Yet over and over again, I kept hearing the grumbling, “why do I have to do it this way?”

A recent example came up with the way that Django does model inheritance. There are a few ways to do it, with important differences in how the database tables are organized. You have to understand these differences in order to make a good choice, so of course, it takes a little time to research.
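
To make the difference concrete, here’s a rough sketch of the two table layouts you end up choosing between. It uses plain sqlite3 rather than the Django ORM, and the table and column names are illustrative, not Django’s exact output: abstract base class inheritance copies the parent’s fields into each child’s table, while multi-table inheritance gives the parent its own table and links each child to it with a one-to-one key.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Abstract base class inheritance: the parent contributes fields but gets
# no table of its own; each concrete child gets a full copy of the columns.
conn.executescript("""
CREATE TABLE app_article (
    id INTEGER PRIMARY KEY,
    title TEXT,        -- inherited from the abstract base
    body TEXT          -- defined on the child
);
""")

# Multi-table inheritance: the parent is a concrete model with its own
# table, and each child table holds only its own fields plus a one-to-one
# foreign key back to the parent row.
conn.executescript("""
CREATE TABLE app_post (
    id INTEGER PRIMARY KEY,
    title TEXT
);
CREATE TABLE app_review (
    post_ptr_id INTEGER PRIMARY KEY REFERENCES app_post (id),
    rating INTEGER
);
""")

# Fetching a child instance under multi-table inheritance implies a join:
conn.execute("INSERT INTO app_post (id, title) VALUES (1, 'Hello')")
conn.execute("INSERT INTO app_review (post_ptr_id, rating) VALUES (1, 5)")
row = conn.execute("""
    SELECT p.title, r.rating
    FROM app_review r JOIN app_post p ON p.id = r.post_ptr_id
""").fetchone()
print(row)  # ('Hello', 5)
```

The hidden join on every child-model query is exactly the kind of cost you can only weigh if you understand what’s happening at the table level.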

Having worked with Java’s Hibernate, I recognized some of the similarities in Django’s approach to addressing the fundamental problem of the impedance mismatch between object classes and database tables. Every ORM must deal with this, and there are nicer and less nice ways to deal with it. The point is, there’s no way to avoid it.

I realized that the complaints weren’t actually about Django per se, despite appearances. They were complaints about not understanding the underlying technologies. People were expecting Django to replace the need for knowledge about databases, HTTP, HTML, MVC architecture, etc. It doesn’t. That’s a poor way to approach what Django can do for you.

The metaphor of tools is useful here. If you gave me a set of sophisticated, high-quality tools to build a house, but I didn’t know the first thing about construction, I might complain that the tools suck because I can’t easily accomplish what I want, because I’m forced to use them in (what seems to me to be) clumsy ways. But that would be terribly misguided.

So the complaints weren’t about the merits of Django compared to how other frameworks do things. What they’re really saying is, “This is different from how I would do it, if I were writing a web framework from scratch.” Which is funny, because I’m not convinced they could invent a better wheel, given their limited knowledge and experience. (This is not a dig: I’ve worked on quite a few projects, many with custom frameworks, and doubt I could conceive of something easier to use and still as powerful as Django. Designing frameworks is hard.) Sometimes the complaints are thinly veiled anti-framework rants, which is fine, I suppose, if you prefer the good old days of 1998. But God help you if you try to create anything really complicated or useful.

Goodbye Ubuntu, Hello Debian Testing

This past weekend, I finally made the switch: I replaced Ubuntu with Debian testing on my main computer.

I really dislike the direction that Ubuntu has been taking lately. Don’t get me wrong: from a technical standpoint, Ubuntu is a great distro, the first and only Linux I’ve used where every single thing Just Worked after installation (I’ve run Slackware and Debian in the past, and maybe one or two others I can’t remember just now). I liked that its releases did a good job of including very recent versions of software. Without a doubt, Ubuntu has done a LOT to put Linux within reach of a wider user base.

But it’s come at a cost. Ubuntu 12.04, which is what I used to run, has spyware. (Here’s a good page with instructions on how to remove it, as well as make other tweaks.) Even if you like Unity, it’s a huge resource hog. And it annoyed me the way Ubuntu’s app store was so similar to the package manager: it seemed designed to lure people into the app store unnecessarily. The shopping results in Dash and privacy concerns were the straws that broke the camel’s back.

I get that Canonical is a business whose ultimate goal is to make money. I wonder if a subscription fee model would have worked for them. I would have gladly paid a reasonable amount to get a quality, user-friendly, up-to-date distro.

So yeah, I’m now running Debian testing on my Toshiba Portege R835 laptop. I chose Debian testing mostly because a lot of packages in stable are a bit too old for my tastes. stable is a great choice for the server, but for my everyday machine, I wanted the latest and greatest, or the closest thing to it that’s still fairly dependable. Debian testing fit the bill.

The install process is not as easy as Ubuntu’s, but it was fairly painless and seems much improved from years ago. A few notes on what I did:

  • Since I wanted “testing”, I used the latest daily snapshot of the Debian Installer.
  • On the first screen, I chose the advanced options to select Xfce as my desktop, so I wouldn’t have to uninstall GNOME later and install Xfce manually.
  • When the install process finished and I rebooted, my wireless didn’t work. The wireless device in my laptop is an “Intel(R) Centrino(R) Wireless-N”, which requires an additional package with firmware to be installed. Run “apt-get install firmware-iwlwifi” as root to get it, and reboot.
  • I changed my /etc/apt/sources.list file to use “testing” instead of “jessie” so that I would always be tracking the rolling testing release.
  • Getting Flash to work in the browser requires adding the “contrib” and “non-free” sections to the apt sources, and installing the “flashplugin-nonfree” package.
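
Putting the last two notes together, my /etc/apt/sources.list ended up along these lines (the mirror URL is just an example; pick one near you):

```
deb http://ftp.us.debian.org/debian/ testing main contrib non-free
deb-src http://ftp.us.debian.org/debian/ testing main contrib non-free
```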

That’s it! Suspending my laptop works just fine, and connecting USB drives and devices works without any additional setup (which was not the case the last time I used Debian many years ago!). So far, all my applications have been working seamlessly with the old data I copied over.

I like having the peace of mind that Debian would never install spyware or intentionally compromise users’ privacy. Yes, it was just a bit more work to install, and getting non-free software that I unfortunately need to use for work is a bit of a hassle, and there will probably be small configuration annoyances in the future that make it less “magical” than Ubuntu. But I’m willing to deal with that.

I hope to replace Ubuntu with Debian testing on my desktop machine at work too sometime in the next few weeks. So long, Ubuntu, it’s been nice.

Goodbye, Sublime Text?

When one of my coworkers started using Sublime Text about a year ago, I was intrigued. I played with it and found it to be a very featureful and speedy editor. I wasn’t compelled enough to make the switch from Emacs, though. (You’ll pry it from my cold dead hands!) But I really liked the fact that you could write plugins for it in Python.

So for fun, I gradually ported my Emacs library, which integrates with a bunch of custom development tools at work, to Sublime Text. It works very well, and the ST users in the office have been happy with it. Although I don’t actually use ST regularly, I’ve since been following news about its development.

What I discovered is that many of its users are unhappy with the price tag and dissatisfied with the support they received via the forums. So much so, in fact, that there’s now an attempt to create an open source clone by reverse engineering it. The project is named lime.

I learned about this with very mixed feelings. There’s a good chance the project will take off, given how much frustration exists with ST. Of course, the trend is nothing new: open source software has been supplanting closed source commercial software for a long time now. But this isn’t Microsoft or Oracle we’re talking about; it’s a very small company, charging what I think is a reasonable amount of money for their product. While they undoubtedly could do more to make their users happier, they probably can’t do so without hurting what I imagine are pretty slim profit margins. That, or never sleeping again.

It’s not news that making a software product is much less viable than it used to be. Where money is made, it’s increasingly through consulting and customization, but one wonders about the size of that market.

It’s generally a good thing that open source has “socialized” software development: technology has enabled communities of programmers to contribute and collaborate on a large scale, in a highly distributed fashion, to create good quality software available to all, taking it out of the profit equation. The problem is that the rest of the economy hasn’t caught up with this new kind of economics.

I don’t mean to sound dramatic: there are many jobs out there for programmers, of course. But it saddens me that if you want to create a product to sell, it’s simply not enough to have a good idea anymore. It has to be dirt cheap or free, you have to respond to every message immediately, and you have to honor every single feature request. Between the open source world and the big software companies that service corporate customers, there is a vast middle ground of small companies that is quickly vanishing.

Making Emacs an IDE

It’s that time when bloggers wax introspective about the past year. The major personal revelation of 2011 was re-discovering something very old, and putting it to new use. For me, 2011 was the year of the Emacs IDE.

I’ve been using Emacs, on and off, for close to a decade now. What’s changed is that, in the past few months, I’ve been writing extensions for it. It started with a simple desire to better navigate files in a complex directory hierarchy that followed specific and somewhat convoluted conventions. At first, learning Emacs Lisp was simply a means to an end, but I ended up liking it so much that I started exploring Common Lisp (and more recently, Clojure, since I’ve worked with Java in the past).

What started as a small task has become a larger project of turning Emacs into an IDE.

To understand this, one needs to know some context about the system I work with. We developers edit tons of XML files and files in other text formats, which all drive our proprietary web application product. We have many command line tools that manipulate these files in various ways; the system was originally designed by folks who followed the UNIX philosophy of building orthogonal tools and chaining them together.

There are pros and cons to this system; for reasons I won’t get into, I don’t love it, but it’s what we work with right now. When I started the job, the vast majority of the developers used screen, vi, and the shell prompt. Typical workflows that involved working with only a few files could be extremely hard to keep track of, and usually required a lot of copying and pasting between screen and/or ssh sessions. Few people seemed to mind, but I found the workflow to contain too much extraneous cognitive load, and the state of the tools made development very prone to error.

Gradually, I’ve been integrating our tools into Emacs. Sometimes that simply means binding a key combination to running a diagnostics program and storing the output in a buffer. Sometimes it means accumulating that output for history’s sake. Sometimes it means parsing program output, processing it in Emacs Lisp, and inserting a result into the current buffer. Sometimes it means running external programs, even GUI applications, and tweaking them a bit to tell Emacs to open specific files you want to look at.
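
The first of those patterns is about as simple as integration gets. As a minimal sketch (the tool name “check-project” and the key binding are hypothetical stand-ins, not our actual tools):

```elisp
(defun my/run-diagnostics ()
  "Run the (hypothetical) check-project tool; show its output in a buffer."
  (interactive)
  (let ((buf (get-buffer-create "*diagnostics*")))
    (with-current-buffer buf
      (erase-buffer)
      ;; Run the external program synchronously, dumping stdout into BUF.
      (call-process "check-project" nil buf nil))
    (display-buffer buf)))

;; One keystroke instead of switching to a shell, running the tool,
;; and copying its output back by hand.
(global-set-key (kbd "C-c d") #'my/run-diagnostics)
```

The payoff isn’t the few keystrokes saved; it’s that the output lands in a buffer, where all of Emacs’s searching, saving, and further processing is immediately available.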

The productivity gains have been amazing. This is no reason to brag: managing several screen sessions with different vi and shell instances wasn’t exactly hard to improve upon. But Emacs made it fairly painless. Emacs Lisp has proved to be wonderful “glue” for integrating existing tools in our environment.

Writing tools that enable you to do other job tasks better is a really interesting experience; I’ve never done it to such an extensive degree. So far, one other person in my group is using this Emacs IDE, and she has been happy with how much it facilitates her work. Others who swing by my desk for something often watch me work for a moment, and ask, “how did you do that?! that’s not vi, is it?”

Getting more people to switch over means convincing them that the steep learning curve of Emacs is worth the gains of the integration I’ve done. I’m not sure how much that will happen, since a big part of it is cultural. But if there aren’t any more converts, I don’t really care. The best thing about this ongoing project is that I am the end user. The software I wrote is really for myself. It is something I use intensively every single day. And that makes it all the more gratifying.

Learning Lisp

For work projects, I often have to jump around a lot of XML files referenced in one another by name. The files follow a strict naming convention, but it’s one that looks like gibberish to human eyes. Typing them is difficult and prone to error; plus there are so many files in a given project that navigating a visual tree of the directory isn’t much easier.

So I wrote a small library for Emacs to find and open a file in the directory tree, if the cursor is over a string of text that matches a valid filename in the convention. I’m embarrassed to admit this took me two full days (a few other functions were in the library too). The only previous exposure to Emacs Lisp I had was tweaking my .emacs file, so I had to learn the language as well as hunt down the right functions to call for what I wanted. The work is paying off: it feels ten times easier to navigate project files now.

I experimented with trying to do this in Komodo Edit, a popular editor among coworkers, but it involved learning Mozilla’s XUL and Komodo’s own API, as well as writing JavaScript (yuck), so I abandoned that effort pretty quickly.

I didn’t think I would like Lisp, but the experience has been pretty fascinating, and I’m now making my way through the Practical Common Lisp book by Peter Seibel. It’s interesting learning a language that’s half a century old (!), and that’s influenced so many contemporary languages. One might think that there’s nothing there worth learning or revisiting, but that is so wrong. In particular, I’m trying to wrap my head around the power of Lisp macros and the way they allow you to create new syntactic abstractions. The idea of extending the very language itself, rather than just adding new functions, is mind-boggling, to say the least. And, from what I understand, it remains fairly unique to Lisp in spite of the flood of new languages in the past few decades.
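
A tiny illustration of what I mean (my own toy example, not from the book): a macro can define a new control construct, which an ordinary function can’t, because a function’s arguments are evaluated before the call.

```lisp
;; REPEAT is new syntax: (repeat 3 ...) runs its body three times.
;; A function couldn't do this -- its "body" argument would be evaluated
;; once, before the call. The macro instead rewrites the code into a
;; DOTIMES loop before it ever runs.
(defmacro repeat (n &body body)
  (let ((counter (gensym)))   ; fresh symbol, so no name in BODY is captured
    `(dotimes (,counter ,n)
       ,@body)))

;; Expands into something like (dotimes (#:g42 3) (format t "hello~%")):
(repeat 3
  (format t "hello~%"))
```

Once you see that the language’s own looping constructs could have been written this way, “extending the language itself” stops sounding like hyperbole.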

It’s disheartening not being able to find much info about who actually uses Lisp anymore, aside from hackers building modules for Emacs. Paul Graham has a cool essay, “Beating the Averages”, about using Lisp to build online store software that Yahoo eventually acquired. And ITA Software, which makes an airfare search engine that powers much of the travel industry, uses Lisp. But aside from these bits of info, there isn’t much out there in the way of Lisp “success stories.”

Tradesmen-Programmers

It takes quite a bit of effort to decipher my work history. I’ve worked on enough diverse software projects that I’m not a “junior” programmer; but because of the time I spent in academia doing non-tech related studies, I don’t have a substantial enough career history to justify a “senior” status. Recruiters and prospective employers usually lack the patience to make sense of it all. Though my core software development skills are solid, I’m regarded as a misfit.

Three months ago, I landed a new full-time job. The company is full of highly intelligent oddballs with convoluted, non-linear professional histories—so I feel like I fit right in. I’m liking it a lot so far, and I feel lucky to have work in such an awful economy.

It’s precisely because of my weird background that I often forget exactly how much I know, and what I’m capable of. It occurred to me recently that I think of programmers as tradesmen. Most recent college grads with CS degrees are poorly prepared for real world software development. Genuine knowledge comes from hands-on experiences as “apprentices” to seasoned programmers. If you are lucky to work with good people, what you learn is not simply a particular programming language or technology, but a paradigm of core principles that will stay with you for life.

I’ve been lucky to work under some great “masters” of the trade. More than a decade after my first “real” programming job, for example, I still use the fundamentals I learned from coworkers on that project: proper object oriented design and modeling, what abstract data types really are and how they’re useful, the extreme importance of extensibility and maintainability in an “agile” world. These things have become such a part of me that I take them for granted.

When I meet or talk to other programmers, I listen carefully to how they talk about things, what concerns they raise, how they approach problems. These are what distinguish the tradesmen-programmers from those whose aim is just to “get it to work,” and who inevitably run into all sorts of problems because of that mindset. Tradesmen-programmers have to balance the practical and theoretical aspects of software design in order to create something of quality, in all measures of that word. And this is something that can only be learned through dedication, experience, and open-mindedness, not a degree, certificate, or book.

On Programmer Insecurity: Is it Personality or the Market?

Here’s a wonderful blog post by Ben Collins-Sussman, “Programmer Insecurity”, to which Jesse Noller has responded with “Programmer Insecurity and Mea Culpa”. (I don’t know either of these folks, I just follow their blogs in my RSS reader.) Ben talks about the need for more transparency, communication, and iterative growth in a programmer’s development:

Be transparent. Share your work constantly. Solicit feedback. Appreciate critiques. Let other people point out your mistakes. You are not your code. Do not be afraid of day-to-day failures — learn from them. (As they say at Google, “don’t run from failure — fail often, fail quickly, and learn.”) Cherish your history, both the successes and mistakes. All of these behaviors are the way to get better at programming. If you don’t follow them, you’re cheating your own personal development.

At the moment, I’m lucky to have fairly down-to-earth colleagues who generally foster these principles, but overall, this sort of perspective is sadly all too rare.

I don’t think it’s purely a matter of personality peculiar to programmers, or as Ben suggests, just “human nature” to fear embarrassment. I mean, sure, to an extent… but the fear is also fostered by a competitive labor market that values personal marketing over personal growth.

That’s why there are so many “best practices” blogs, vanity websites boasting of track records, and heated religious arguments about almost anything pertaining to code. The market has created a culture of showing off. And if you can demonstrate you are more “perfect” than the next guy or gal, you’ll impress the interviewer and land the job or the gig. One might argue, rightfully, that these are not great places to work. But places like Google where there is a generous philosophy of employee growth are probably the exception rather than the rule.

I can remember a time when things were different.

Making It Happen

A while ago, I had an idea for a cool website: grouping blog posts under “debate” questions. There was a simple mechanism for automatically detecting when a blog post linked to a debate page. The point was to be able to group postings with more semantic richness than simple tags or categories.

It fell by the wayside, so I took it down after a few months. Recently I’ve been seeing sites pop up based on similar ideas. AllVoices is one, and Debategraph.org is another. It’s nice to see the idea of information richness continuing to develop in interesting ways.

And it makes me wish I had stuck with my site idea, though to be honest, it wasn’t realistically feasible. The hardest part wasn’t coding the functionality, which only took 2-3 weeks of the large pool of free time I had back then. (Side goals were to get a working knowledge of CherryPy and sqlalchemy, so at least those were accomplished!) No, the real difficulty was “selling” it to users: publicizing the site, making it visually attractive and user-friendly, and getting people to use it in their own blogs. I didn’t have the skills or resources to make those things happen.

There’s an adage that says success on the web largely depends upon execution, not the concept. That’s so true. I feel like I’ve known so many smart, talented technology people who excel at what they do but haven’t been able to pull off their interesting side projects. I think it’s because we often underestimate the non-technical challenges in getting a website off the ground. In many ways, those are more important to do well than solving the technological problems.

The oxymoron of perl best practices

I’ve been browsing Damian Conway’s excellent book, Perl Best Practices. His description of “best practices” in Chapter 1 is quite good, bestowing meaning upon an otherwise empty buzzword for marketing departments:

Rules, conventions, standards, and practices help programmers communicate and coordinate with one another. They provide a uniform and predictable framework for thinking about problems, and a common language for expressing solutions. This is especially critical in Perl, where the language itself is deliberately designed to offer many ways to accomplish the same task, and consequently supports many incompatible dialects in which to express any solution.

Simply put, best practices are the recognizably tried-and-true. The key word is recognizable: others should be able to identify the problem being solved as well as the specific type of solution. In a way, it’s redundant to speak of best practices when talking about programming: unless no one will ever see your code, the foremost consideration of the programmer should be, can other people understand what I’ve done easily and quickly, and also make changes and improvements easily and quickly?

That question is really at the heart of coding as a craft of logical elegance, I feel. Giving it due care and consideration is what distinguishes artisans from those who just “get it done” (they always seem to run into perpetual headaches down the line).

I find the philosophy of perl to be highly antithetical to the sort of shared understanding and communication described above. The motto of “there’s more than one way to do it” is self-serving and individualistic, to the detriment of maintainability. Perl seems to foster stubbornness and isolation: I’ll do things MY preferred way, and you do it YOURS. The codebase quickly devolves into baffling inconsistency. Of course, skilled perl programmers pride themselves on being able to know all the different ways of doing something, but at that point, the strength of the language isn’t the issue anymore; the ego of the coder is.

Anyway, back to Conway’s book. What’s interesting, but perhaps not surprising, is the number of “don’t do X” recommendations it makes. Leaving off parentheses on subroutine calls? Convenient, but ugly and confusing when a bunch of these are combined, so don’t do it. Tricks with dereferencing syntax? Save everyone headaches and stick to arrows. Pseudohashes? Stay away from them. Indirect object syntax? Potentially conflicts with subroutine names, so don’t use it.

Take away all the crazy tricks that perl devotees love, and what’s left? The only truly great thing about perl is its regular expression integration and text processing. It’s amazing stuff. But otherwise, perl doesn’t particularly excel at much else.

Which is not to say large complex projects in perl are impossible. The University of Washington’s Catalyst tools use a custom web framework written in perl, which they’ve open sourced. I only cite that particular example because I’ve used some of the tools and I know they’re nicely done. So sure, it’s possible, but it takes a lot of deliberate restraint and discipline, more so than with other languages and tools that are oriented towards helping you get your job done.

Seems to me that if you must willfully refrain from using what a language offers, and constantly fight the temptation to write mangled code that costs time, money, and frustration for others to decipher, perl isn’t a particularly strong choice. In the case of perl, best practices aren’t a way to get the most out of the language; they’re a stopgap measure to stave off its natural tendency towards chaos.