ONA09 un-conference session proposal: The craft of making software — Anyone interested?

UPDATE — The folks at ONA have announced that they’ll provide rooms for un-conference talks! Woot! But if there’s more need than space, it’ll be up to a vote, so *please* get there early and vote me up! Hope to see you there!

The schedule for ONA09 is jam-packed with shiny stuff — social networks, mobile tech, they’ve even got Leo and Ev! Great. But the reality is that Twitter will not save the news, just like chrome rims can’t save General Motors.

We can talk about technologies, tools and innovators all weekend long, but it won’t help if news organizations don’t understand the basic principles of software development. So, if anyone out there is interested, I’d like to arrange an un-conference to talk about some un-shiny, boring-ass shit: software development methodologies.

Topics of import we might address:

  • Version control
  • Task and defect tracking
  • Goals, use cases and designing with your audience in mind
  • Working iteratively and being agile

Code is not something you can slap up like wallpaper. Making software is a craft. It requires discipline and skills far beyond a superficial awareness of the technologies available. At every moment of the process, from conception to release, there are right and wrong ways to make software.

Imagine a news organization with only writers, and no editors. They might manage to crank out some successful stories, but without editorial controls, the failure rate would be astronomical. From what I’ve learned in my (admittedly brief) time in this industry, this is the state of software development at newspapers — it’s failure-ridden, amateurish and ad-hoc.

Let’s do it the right way

Over the years, lots of clever people have studied the craft of software development, and come up with battle-tested tools, best practices and processes to reduce the failure rate and better ensure success. I learned a thing or two about these methods in my previous life, and would love to share.

So, I’d like to set up an un-conference session. We’ll get a room and a projector and talk process. Who’s interested in attending? What topics would you like to see addressed? Would anybody else like to present?

(If there’s no response, I’ll shut up and go back to work — but if I’ve convinced you, please leave a comment. No comments, no un-conference.)

Twitter lightning talk

Wikipedia sez: “A Lightning Talk is a short presentation given at a conference or similar forum. Unlike other presentations, lightning talks last only a few minutes and several will usually be delivered in a single period by different speakers.”

This particular lightning talk was delivered to the ProPublica newsroom a couple weeks ago. To experience it best, open all the links in tabs, print the talk, then read through it and flip the tabs as quickly as possible. (Warning: may cause seizures.)

What’s Twitter?

Really, what’s Twitter?

So it’s like a blog?

…so, how does this all work?

Replies, retweets and links, oh my!

Searches, hashtags, and trends

The Twitter website sucks

  • Desktop applications like Twhirl and TweetDeck make Twitter immediate. You use them to tweet and to see replies and search results, live, similar to how you’d use Gchat or AIM.

I know kung fu.

  • Twitter can be like your own Headline News, but tuned to your
    interests. You can know, to the moment, what’s happening with people
    and topics you care about.
  • With a well-configured TweetDeck, you can hear the Internet hum.
  • We call this experience “ambient intimacy.”

…Twitter for journalists

Tweet your beat

Ask for help

Be aware

Find a job

  • I tweeted two weeks ago that my friend wanted a job at Playboy. Jimmy Jellinek called her last week, and this morning she got the job. I’m not trying to take credit for this, but it really was all me.

And remember, if you don’t tweet, they will.

Some members called it a new age of transparency, a bold new frontier in democracy. But to view the hodgepodge of text messages sent from the House floor during the speech, it seemed as if Obama were presiding over a support group for adults with attention-deficit disorder.

Further reading

Feeds, tweets and APIs are the beginning. Will news orgs step up to augment reality?

In her TED talk, Unveiling the “Sixth Sense,” game-changing wearable tech, Pattie Maes demos a system that creates interactive visual layers over the real world. The actual implementation, a tiny projector tied to a wearable computer that watches your fingers for input (using colored marker caps to identify fingertips!), is cheap, but not something you’d likely want to wear to the store.

But imagine for a moment a similar system, one that detects more subtle gestures and does not physically project light onto the objects you’re manipulating. A device that annotates the real world and presents information about the person in front of you, the product you’re considering purchasing or the comparative likelihood of catching a cab at this corner or the next block over.

Map for driving by eszter, based on MacGyver Tip: Heads up display with a reversed paper map from LifeHacker.

I’ve blogged about this before, but Maes’ talk reminded me how important this technology will be. It *will* happen, and although there’s much work left to do on the end-user interface (Rainbows End by Vernor Vinge and Counting Heads by David Marusek present brilliant visions of how they might work), the inputs to these systems are coming online today.

Feeds, tweets and APIs aren’t just for the web

Twitter, when paired with TweetDeck, gives me an always-on, ambient awareness of events worldwide. It’s like a tiny, quiet news radio, feeding me timely information on events I care about. When I’m at my desk, I can hear the Internet hum. Soon, that spatial restriction will be lifted.

I already use Amazon from my phone’s web browser when I’m shopping, but the APIs are there to build new, better interfaces that, as Maes demos in her talk, can port Amazon ratings and everything else into the real world.

The NYT’s and The Guardian’s new APIs are similarly useful, but present even richer information: detailed, expert analysis of not just products, but news and events. (And surely Bittman’s recipe for Roast Chicken With Cumin, Honey and Orange would be handy to have on a heads-up display, at the grocery store, when cooking, and when you’re regaling friends with the elegant simplicity of roasting a whole bird.)

Who’s building the future?

Of the 1180 APIs cataloged at ProgrammableWeb, only 18 are categorized as “news”. If news orgs want to hang on to their last shred of credibility as the essential information providers of the last century, they’d best get on it.

APIs are the future of information, and the content creators who adopt them will augment our reality.

Three reasons the new Tribune tabloid should be free, a twitter serial (republished in blog form)

The Tribune announced this week that they will begin printing the fat Chicago daily as a tabloid! Huzzah!

The Tribune’s move, replacing its broadsheet edition with the tabloid version at the retail level, is an aggressive bet that a switch in size will improve sales. There are no plans to make the tabloid-sized edition available for home delivery.

I thought today’s tweets on the topic might be good to republish.

brianboyer: Three reasons the new Tribune tabloid (http://bit.ly/qLN3) should be free, a twitter serial. Bring on the haters
brianboyer: 1 If the paper is free, I pick it up. (Even the Red Eye, which runs 1/3 celebritrash, not including sports, neither of which I care for.)
brianboyer: 2 On a tabloid page, I notice (and occasionally read!) the ads. On a broadsheet, they’re just annoying blocks to reading.
brianboyer: 3 TribCo’s current gratis daily is crap. A proper paper would bring a new age of enlightenment in Chicago. Embiggen, Obama, Olympics.
brianboyer: = $$$ from more effective ads shown to a larger audience, plus an improved, vital brand, loved by a better-informed populace. You dig?

Am I nuts? Riding the train this morning, I saw a lot of glassy-eyed folks reading the Red Eye, and a few diligent readers struggling with the crowd and the fatty broadsheet.

From concept to sketch to software: Building a new way to visualize votes… mmm, environminty!

Ryan Mark and I built enviroVOTE to help people visualize the environmental impact of the 2008 elections. We designed it in two evenings and made it real in a three-and-a-half-day bender of data crunching and code.

This is the story of that time.

Sketch of enviroVOTE + coffee = enviroVOTE is real, live software

Sunday evening, 26 October: the concept

The idea struck us when Ryan and I discovered we had a common problem: homework. Ryan was on the hook to produce a story about the environment for News 21’s election night coverage, and I needed to build an example presenting news data in some interesting way using charts and graphs. So we decided to combine our efforts and make something that would visualize environmental information about the election.

We searched for data to present, and found that it came in many shapes: a candidate’s track record of support on environmental issues, statistics on national parks, nuclear power and everything in between. But the most compelling data set we found was not stats- or issue-based: endorsements made by environmental groups.

Statistics were cut because they’re only peripherally related to the races being run. It’s not particularly interesting to say something like “in states with more than five hydroelectric power sources, the Democratic candidate prevailed 18% of the time.”

Only sportscasters can get away with that crap.

Why not issues, then? They’re hard to quantify. Candidate websites are frequently slippery, ambiguous things, and we found that few politicians responded to efforts that would make their positions crystal clear, like Project Vote Smart’s Political Courage Test and Candid Answers’ Voters Guide to the Environment. The best data we could find were candidates’ voting records, but without understanding the nuance of each piece of legislation, it’s nearly impossible to determine if a vote was for or against the goodness of the earth. (Also, only incumbents have voting records.)

An endorsement is a true-false, unambiguous, easy-to-count thing. Environmental groups like the Sierra Club and the League of Conservation Voters publish their support for candidates online. Even better, the aforementioned Project Vote Smart — a volunteer group dedicated to strengthening our democracy through access to information — aggregates endorsements and makes them readily available for current and historic races. And Vote Smart makes them available via an API, so others can mash up their data, just like we were itching to do.

Wednesday evening, 29 October: the design

A second fury of inspiration led to the design of the site. Marcel Pacatte, my instructor and head of Medill’s Chicago newsroom, was our source of journalistic wisdom. He and I identified our audience and discussed the angles and presentation methods that would best serve them. Obvious ideas like red/blue states and a map of the nation’s greenness were tossed — maps aren’t all that good at showing off numbers. (Notable exceptions include cartograms and the famous diagram of Napoleon’s march to Moscow, neither of which seemed sensible metaphors to adopt.)

Working out the enviroVOTE concepts on a whiteboard. Scope creep, be damned!

We decided not to make a voter’s guide, since there was little time before the election for folks to find the site, and instead to make something interesting on election day and useful in the days following. So we looked for numbers to support that mission.

Counting environmentally friendly victories would be both timely on election night and purposeful later. We could calculate a win for the earth by counting endorsements: if the winning candidate had more endorsements, it was a green race. This was easy to aggregate nationally as well as by state.
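
The math, in code terms, is about as simple as it sounds. A quick sketch of the idea (hypothetical objects and attribute names, not our actual code):

    def is_green_race(race):
        # A race is a win for the earth when the winning candidate
        # collected more endorsements than any opponent.
        winner_count = len(race.winner.endorsements)
        opponent_counts = [len(c.endorsements)
                           for c in race.candidates
                           if c is not race.winner]
        return winner_count > max(opponent_counts or [0])

    def green_percentage(races):
        # Aggregate nationally (pass every race) or by state
        # (pass just that state's races).
        decided = [r for r in races if r.winner is not None]
        if not decided:
            return 0.0
        greens = sum(1 for r in decided if is_green_race(r))
        return 100.0 * greens / len(decided)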

And by running the same numbers on the previous races (two years ago for the House, six for the Senate, etc.) we could calculate the change in the environmental-friendliness of the nation’s elected officials, a figure that became known as “environmintiness.”
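
Building on the counting sketch above (again illustrative, not the production code), environmintiness is just a percent change:

    def environmintiness(current_races, previous_races):
        # Percent change in the share of green victories, comparing
        # this cycle to the last time the same seats were contested.
        now = green_percentage(current_races)
        then = green_percentage(previous_races)
        if then == 0:
            return None  # no baseline to measure change against
        return 100.0 * (now - then) / then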

In addition, some races potentially held more impact for the environment than others — because of their location or the candidates running — so we decided it was necessary to highlight these key races alongside the numbers.

The sketch that served as the primary design document for enviroVOTE

In a whirlwind sketch-a-thon, the design for the site flew together. We would show off the two big numbers in the simplest possible way. No maps, pies or (praise the lord!) Flash necessary. They’re both just percentages. To set off one from the other, we decided on a percentage for the percent change, and a one-bar chart for the victory counts, in aggregate and for individual states.

Users would be interested in seeing results from their home state, so we made the states our primary navigation, and listed them, along with their bar chart, down the left side of the page. (We explicitly decided to not use a map for navigation, like most sites do. If I lived in Rhode Island, I’d effing hate those sites.)

Putting the big numbers front and center and listing the incoming race results down the right gave users an up-to-the-minute snapshot of the evening. The writeups about key races, though important, were our least timely information, so we made them big and bold, but placed them mostly below the fold.

We produced a simple design, just three pages — home, a state and a race — each presenting more detail as you drilled down.

Saturday and Sunday morning, 1-2 November: the development

Development began Saturday morning. We decided to build the site on Django, the free and open source web development framework that we were concurrently using to build News Mixer, the big final project of our master’s degree program. (If you’re interested in our reasons why, and how it all works, check out my post that explains the same stuff re: News Mixer.)
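
For a flavor of what that looks like, here’s a minimal sketch of how a site like this might be wired up in Django. The models and URL patterns are illustrative guesses written against today’s Django, not the actual enviroVOTE source:

    # models.py -- an illustrative guess at the core schema
    from django.db import models

    class State(models.Model):
        name = models.CharField(max_length=50)
        abbreviation = models.CharField(max_length=2)

    class Race(models.Model):
        state = models.ForeignKey(State, on_delete=models.CASCADE)
        office = models.CharField(max_length=100)
        is_key_race = models.BooleanField(default=False)

    class Candidate(models.Model):
        race = models.ForeignKey(Race, related_name='candidates',
                                 on_delete=models.CASCADE)
        name = models.CharField(max_length=100)
        party = models.CharField(max_length=50)
        is_winner = models.BooleanField(default=False)

    class Endorsement(models.Model):
        candidate = models.ForeignKey(Candidate, related_name='endorsements',
                                      on_delete=models.CASCADE)
        group = models.CharField(max_length=100)  # e.g. the Sierra Club

    # urls.py -- the three-page drill-down: home, a state, a race
    from django.urls import path
    from . import views

    urlpatterns = [
        path('', views.home),                     # the big national numbers
        path('state/<str:abbr>/', views.state),   # one state's races
        path('race/<int:race_id>/', views.race),  # one race, in detail
    ]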

We brainstormed names for our new baby, and immediately checked to see if the URLs were available. envirovote.us was the first one we really liked, so we bought it and started running. Ryan designed a logo and whipped up a color scheme, and thus a brand was born.

Improvising the details, we built the site very close to how it was designed. (The initial sketches were mine, but Ryan gets the props for making it look so damn sexy.) Coding the site took about a day and a half, minus time for Ryan to go home and sleep, and for me to cook soup.

We used the awesome, free tools at Google Code to list tasks and ideas, manage our source code, and track defects. The simple concept and excellent tools helped make this a relatively issue-free development cycle. Django, FTW!

Sunday afternoon and Monday, 2-3 November: the gathering, massaging, and jamming in of data

With the code pretty much finished, minus subsequent bug fixes and tweaks, we started on the data.

Ryan used the Project Vote Smart API to gather information on current and historical races: the states, districts, and candidates that form the backbone of our system. He wrote Python scripts to repeatedly call the API, munge the response, and aggregate all of the races, candidates, wacky political parties, and the rest into files we could then pump into the database.
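
I don’t have Ryan’s scripts to show, but the shape of them was something like this sketch (modernized to Python 3; the method name, parameters and response fields are placeholders, so check the Project Vote Smart API docs for the real interface):

    import csv
    import json
    import urllib.request

    API_KEY = 'your-vote-smart-key'
    BASE = 'http://api.votesmart.org/'

    def call_api(method, **params):
        # One GET per call; the API can return JSON if you ask for it.
        params.update(key=API_KEY, o='JSON')
        query = '&'.join('%s=%s' % item for item in params.items())
        with urllib.request.urlopen(BASE + method + '?' + query) as resp:
            return json.load(resp)

    # Repeatedly call the API, munge the response, and flatten
    # everything into a file we can later pump into the database.
    with open('candidates.csv', 'w', newline='') as out:
        writer = csv.writer(out)
        writer.writerow(['state', 'office', 'candidate', 'party'])
        for state in ['IL', 'IN', 'WI']:  # ...and the other 47
            data = call_api('Candidates.getByOfficeState',  # placeholder name
                            stateId=state, officeId='5')
            for c in data['candidateList']['candidate']:
                writer.writerow([state, c['office'],
                                 c['ballotName'], c['party']])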

I attacked from the other side and scoured environmental groups’ websites, as well as the endorsement data provided by Project Vote Smart, to collect the endorsements we used to calculate the big numbers.

Once all the data was collected into text files, we wrote more scripts to read those files, scrub the data of inconsistencies, poor spelling and other weirdness, and finally fill the database.
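
The scrubbing pass was equally unglamorous. A sketch of the sort of thing (the specific fixes are made-up examples):

    import csv

    # Hypothetical examples of the hand-curated fixes such a pass needs.
    FIXES = {
        'Democrat Party': 'Democratic',
        'sierra club': 'Sierra Club',
    }

    def scrub(value):
        value = ' '.join(value.split())  # collapse stray whitespace
        return FIXES.get(value, value)

    with open('endorsements_raw.csv', newline='') as raw, \
            open('endorsements_clean.csv', 'w', newline='') as clean:
        writer = csv.writer(clean)
        for row in csv.reader(raw):
            writer.writerow([scrub(cell) for cell in row])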

All of this took a day and a half, far longer than we had hoped, and as much time as was necessary to build the website. I did not cook soup. We ordered in.

enviroVOTE is real, live software. Coffee, nerd sweat… smells like software. Yet, curiously minty-fresh.

Tuesday, 4 November

After attending class all day in Evanston, Ryan and I headed downtown for an evening of data input and cursing at screens.

Julia Dilday and Alexander Reed watched the AP wire all night, tracking races and gathering results and entering them into the system. I cannot express how much more difficult this was than we anticipated. Julia and Alex: thank you thank you thank you thank you.

Ryan kept the system humming through the night. He tamed the beast: keeping the site online, fixing bugs, and updating the administrative interface in an effort to improve the poor working conditions of Julia and Alex.

I ran the public relations effort: taking interviews, helping input incoming races, and getting the word out about our little project. I also gave enviroVOTE a voice. We set up a Twitter account to tell the nation about environmintiness as the results came in. (For a time, the site automatically twittered with each race result, until we realized that it was sending far more tweets than anyone would ever want to read, and turned it off.)
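
The tweeting itself was trivial; the Twitter API of the era accepted a plain basic-auth POST to statuses/update (long since retired in favor of OAuth). A sketch of the idea, with made-up account details:

    import base64
    import urllib.parse
    import urllib.request

    def tweet(status):
        # 2008-era Twitter API: basic auth, one POST per status.
        # (Retired in 2010, so this is strictly a period piece.)
        req = urllib.request.Request(
            'http://twitter.com/statuses/update.json',
            data=urllib.parse.urlencode({'status': status[:140]}).encode())
        auth = base64.b64encode(b'envirovote:secret').decode()
        req.add_header('Authorization', 'Basic ' + auth)
        urllib.request.urlopen(req)

    def on_result_entered(race, winner):
        # Called each time a result was saved: the firehose we
        # eventually had the good sense to turn off.
        tweet('Result: %s wins %s. Environmintiness update to follow!'
              % (winner, race))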

The aftermath

I’m told the presidential race was noteworthy, though I can’t recall who won — it was just one of nearly 500 races we recorded that night, and we weren’t watching the TV.

Since the 2nd, we’ve fixed a few bugs and we’ve slowly added the final race results as they’ve trickled in. The site is not nearly as dynamic as it was on election night, but maybe we’ll have another few days free next year.