This blog has reported on the rise of big data and how businesses across industries have used it to improve operations and work more efficiently and effectively. But a recent Time article outlined how President Barack Obama was able, essentially, to base his 2012 re-election campaign on the aggregation, analysis and dissemination of big data and turn it into actionable results.
“We were going to measure every single thing in this campaign,” Jim Messina, President Obama’s campaign manager, told the source.
Immediately after taking the position of campaign manager two years ago, Messina hired an analytics department five times larger than the one President Obama used in 2008. Rayid Ghani, who previously analyzed big data to optimize supermarket sales promotions, spearheaded the technical department.
The campaign's data analysis effort was huge. Every day in October, engineers ran the election 66,000 times overnight, preparing for every possible outcome. When managers arrived the next morning, the computers spat out the candidate's chances of winning in every state. The data used in these simulations was updated constantly.
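Running an election tens of thousands of times to estimate win probabilities is classic Monte Carlo simulation. A minimal sketch of the idea follows; the state names, win probabilities and electoral-vote counts here are hypothetical placeholders, not the campaign's actual data or model.

```python
import random

# Hypothetical per-state (win probability, electoral votes) --
# illustrative numbers only, not the campaign's real inputs.
STATES = {
    "Ohio": (0.55, 18),
    "Florida": (0.50, 29),
    "Virginia": (0.52, 13),
    "Safe-D": (1.00, 217),   # states assumed certain for the candidate
    "Safe-R": (0.00, 261),   # states assumed certain for the opponent
}
VOTES_TO_WIN = 270

def simulate_once(rng):
    """Run one simulated election; return electoral votes won."""
    return sum(votes for p, votes in STATES.values() if rng.random() < p)

def win_probability(n_runs=66_000, seed=2012):
    """Estimate the chance of reaching 270 electoral votes."""
    rng = random.Random(seed)
    wins = sum(simulate_once(rng) >= VOTES_TO_WIN for _ in range(n_runs))
    return wins / n_runs
```

Each nightly run like this turns uncertain polling inputs into a single, decision-ready number per scenario, which is what lets managers see updated odds every morning.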
Furthermore, while campaign workers were able to model exactly which type of person was likely to donate or which public figure they might be most swayed by (Obama held major fundraising events with George Clooney and Sarah Jessica Parker), secrecy was the utmost priority. So much so that the analytics department worked in a separate room from the rest of the campaign, and most tasks were given code names. One senior aide described the data sets as their "nuclear codes."
It’s clear that one advantage behind President Obama’s victory on November 6 may have been the massive amount of information his campaign was able not only to possess, but also to process and articulate to major decision makers. For small businesses, the same can be accomplished by consulting FileMaker developers who can create custom database software molded to fit the needs of a specific business.