Two years ago, I joined President Obama’s re-election campaign as one of the first engineers in the Technology department. I worked hard, learned an extraordinary amount from a host of fantastic coworkers, and was privileged to get to apply my craft to help re-elect the President.
After helping start the Narwhal and Dashboard projects during my first few months, I transitioned to lead the Analytics Technology team with Chris Wegrzyn. The Analytics department grew to 54 people, who managed polling, created and updated statistical models, and analyzed any and all data to advise campaign leadership across every department on program strategy and efficacy.
Our team of nine Analytics engineers created, curated, and maintained a 50 TB analytics database, uniting all the campaign’s data into one place – letting us create, coordinate, and analyze holistic, data-driven programs.
Suddenly we could do things like notice a supporter had requested a mail-in ballot and assist them via email to ensure the ballot was cast and counted. We could analyze merchandise purchased via the mailing list, events, and the online store. We created a TV-ad purchasing optimizer that got us 15% more persuadable viewers per dollar.
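The ballot follow-up program above is a classic data-triggered emailer. A minimal sketch of the pattern, in Python: all field names, message text, and the in-memory "database" here are hypothetical illustrations, not the campaign's actual schema or delivery pipeline.

```python
# Hypothetical sketch of a data-triggered emailer: scan supporter records,
# find those who requested a mail-in ballot but haven't returned it, and
# draft a reminder for each. Field names are illustrative only.

from dataclasses import dataclass

@dataclass
class Supporter:
    email: str
    requested_mail_ballot: bool
    ballot_returned: bool

def ballot_followups(supporters):
    """Return (email, message) pairs for supporters whose requested
    mail-in ballot has not yet been returned."""
    return [
        (s.email, "Your mail-in ballot is on its way. Here's how to "
                  "make sure it's counted.")
        for s in supporters
        if s.requested_mail_ballot and not s.ballot_returned
    ]

if __name__ == "__main__":
    roster = [
        Supporter("a@example.com", requested_mail_ballot=True, ballot_returned=False),
        Supporter("b@example.com", requested_mail_ballot=True, ballot_returned=True),
        Supporter("c@example.com", requested_mail_ballot=False, ballot_returned=False),
    ]
    for email, message in ballot_followups(roster):
        print(email, "->", message)
```

In production the trigger would run against the analytics database on a schedule and hand the drafts to a delivery service (the campaign used SES, among other tools); the selection logic itself stays this simple.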
We created a tool (“Stork”) that connected our analytics database with a few key vendor APIs, Google Spreadsheets, mapping, and basic data processing features — and empowered analysts and state and HQ data staff to implement their own automated, data-driven ideas for helping re-elect the President. We released the tool to users when it had a single function, and let user feedback set the agenda for the next 23. We often added new features within hours of users’ requests. Its functions were composable, and served as the basis for several of our own even higher-level tools.
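To make the "composable functions" idea concrete, here is a hypothetical sketch of how Stork-style primitives could chain into higher-level tools. The function names and data shapes are invented for illustration; they are not Stork's actual API.

```python
# Illustrative sketch: small row-level primitives that compose into
# pipelines, the way Stork's functions served as building blocks for
# higher-level tools. All names here are hypothetical.

def where(rows, predicate):
    """Filter rows (the 'database' is stubbed as a list of dicts)."""
    return [r for r in rows if predicate(r)]

def pick(rows, *fields):
    """Project each row down to the named fields."""
    return [{f: r[f] for f in fields} for r in rows]

def compose(*steps):
    """Chain single-argument steps into one pipeline function."""
    def pipeline(rows):
        for step in steps:
            rows = step(rows)
        return rows
    return pipeline

# A "higher-level tool" built from the primitives: Ohio volunteers,
# reduced to the columns a spreadsheet export would need.
ohio_volunteers = compose(
    lambda rows: where(rows, lambda r: r["state"] == "OH"),
    lambda rows: pick(rows, "name", "email"),
)
```

Because each step takes rows and returns rows, analysts can snap steps together in new orders without touching the primitives, which is what lets user feedback drive new features quickly.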
Our process in Analytics Technology was partly agile, but mostly just keep-it-simple and get-it-done.
Our team didn’t have to be web-scale, we just had to be Big Data scale; instead of millions of web requests, we had 6-billion-row tables to join and keep synced, and 200 querying users (and dozens of apps) to keep happy. Mostly, though, we needed to move quickly, taking as many creative (yet often simple, common-sense) ideas as we could and helping make them happen while they could still have an impact. I think a lot of us wished we’d had just a couple more months, and oh, what we could have created!
We used SQL (oh, did we use SQL), Python, Ruby, Java, Hadoop, Postgres, Vertica, cron, git, ElasticSearch, EC2, DynamoDB, S3, SES, and much more. We were generalists who built and maintained our tools collectively, multitasking seamlessly across data ETL, Rails apps, database administration, GIS, data-triggered emailers, Hadoop jobs, and much more.
Some personal highlights of the 17 months included receiving extremely kind letters from state staff thanking the team for our work and our tools, giving a man the Heimlich maneuver at State and Randolph on the way into work, and shaking President Obama’s hand.