The World’s Simplest AutoTweeter (in node.js)

Last month, I set up a quick little autotweeter using Node.js to help me with Repeal Day Santa Barbara. I wrote a short blurb about it beforehand; here’s what actually shipped.

(Many thanks to the guys at the Santa Barbara Hacker Space for inspiring and contributing to this project.)

The plan was simple.

  1. Set up a free server at Amazon Web Services.
  2. Write a simple daemon that processes a queue of tweets, sending them to the RepealDaySB twitter account at just the right time.
  3. Write a bunch of tweets and schedule them.
  4. Run that daemon on the evening in question (December 5).

AWS

Setting up with Amazon was pretty easy. I created an instance at the “free tier” level with port 22 as the only inbound port, using an Ubuntu AMI (ami-6006f309). Installing node.js and emacs was simple once I connected using VanDyke’s SecureCRT, which handled the public key authentication like a charm. With that setup, it was straightforward to start coding. (I did need some other ports to explore a few examples that turned out to be dead ends, but for the live service, all I needed was SSH access on port 22.)

The First Tweet

The next step was to work through the details to get node.js to send just one tweet to Twitter. A lot of the examples out there offer more functionality than I needed, using frameworks like ExpressJS to support authenticating into Twitter as any user.  But I didn’t need that. In fact, I didn’t want an interactive service. I didn’t really need a database and I didn’t need a real-time interface. I just wanted to tweet as me (well, as my RepealDaySB persona).

Twitter has pretty good support for this single-user use case:  https://dev.twitter.com/docs/auth/oauth/single-user-with-examples  If only they’d had example code for node.js…

The good news is that node-oauth is the go-to OAuth library for node.js, and after a bit of wrangling it did the trick.

So, the first thing I did was put my secret keys into twitterkeys.js:

exports.token = '3XXXXXXXXX89-3CbAPSxXXXXXXXXXXy42A9ddvQkFs96XXXXXXX';
exports.secret = 'HHXXXXXesTKZ4bLllXXXXXXXXXX8zAaU';
exports.consumerKey = "XXXXXXXbgfJRXXXXXXXX";
exports.consumerSecret = "9XXXXXXXXXQJ9U8VuoNMXXXXXXXXX";

Then, I could import that file like this:

var keys = require('./twitterkeys');

And access it like this:

var tweeter = new OAuth(
  "https://api.twitter.com/oauth/request_token",
  "https://api.twitter.com/oauth/access_token",
  keys.consumerKey,
  keys.consumerSecret,
  "1.0",
  null,
  "HMAC-SHA1”
);

I did the same thing with my tweets in tweets.js, since I thought that might be useful:

module.exports =
[ {
  status:"test1",
  timestamp: "2011-11-5"
},{
  status:"test2",
  timestamp: "2011-11-7"
}];

And to access that:

var tweets = require('./tweets.js');

The astute observer will note my brilliant plan to use a timestamp for scheduling. We’ll return to that later.

To figure out what to do with my shiny new OAuth object, I looked up Twitter’s API:

POST statuses/update: Updates the authenticating user’s status, also known as tweeting. To upload an image to accompany the tweet, use POST statuses/update_with_media. For each update attempt, the update text is compared with the authenticating user’s recent tweets. Any attempt that would result in duplication will be…

Easy. That’s documented at

https://dev.twitter.com/docs/api/1/post/statuses/update

and the URL to POST to with OAuth is http://api.twitter.com/1/statuses/update.json.

And here’s the code that actually combines all that into my very first tweet from node.js.

var https = require('https');

var OAuth = require('oauth').OAuth;
var keys = require('./twitterkeys');
var twitterer = new OAuth(
  "https://api.twitter.com/oauth/request_token",
  "https://api.twitter.com/oauth/access_token",
  keys.consumerKey,
  keys.consumerSecret,
  "1.0",
  null,
  "HMAC-SHA1"
);

var tweets = require('./tweets.js');

var status = tweets[0].status;

var body = {'status': status};

// url, oauth_token, oauth_token_secret, post_body, post_content_type, callback
twitterer.post("http://api.twitter.com/1/statuses/update.json",
  keys.token, keys.secret, body, "application/json",
  function (error, data, response2) {
    if (error) {
      console.log('Error: Something is wrong.\n' + JSON.stringify(error) + '\n');
      // dump whatever we can read off the response object
      for (var i in response2) {
        var out = i + ' : ';
        try {
          out += response2[i];
        } catch (err) {}
        out += '\n';
        console.log(out);
      }
    } else {
      console.log('Twitter status updated.\n');
      console.log(response2 + '\n');
    }
  });

Data Store

At first I thought I’d use a database. There are plenty that are easily accessible from node.js and I even signed up for a free service that hosted CouchDB. CouchDB is attractive for node.js work because you can basically store JSON objects directly in the database. But that also got me thinking…

Maybe a database is overkill: too much capability for what I really needed. I don’t need to update the tweets during the evening. I don’t need to support simultaneous access. I don’t need speed or scalability. That’s when I realized I was thinking like a client-side developer, which is where I do most of my javascript coding. With node.js on the server, I could just read and write a local file! Turns out it’s easy. I should have thought of that earlier, given that I was already reading my tweets with require(), but I hadn’t considered being able to WRITE to the file to keep track of what had been tweeted.

Here’s how you do it. First, set up the path, using the __dirname variable to access the local directory:

var path = __dirname+"/tweets.js";

Then, to read the file:

var fs = require('fs');
fs.readFile(path,"utf8", function(err,data) {
  if (err) throw err;
    tweets = JSON.parse(data);
});

And to write the file:

fs.writeFile(path, JSON.stringify(tweets, null, 4), function(err) {
  if (err) throw err;
  console.log("It's saved");
});

Date & Time

Now, about that timestamp. I had to represent when I wanted each tweet to go out, and having been here before, I knew it could be tricky to make sure the server agrees on the timezone. Javascript has a great Date() object which can parse ISO-8601 formatted dates (e.g., 2011-12-05T10:00-08:00), so I tried using that. Since the timezone is explicit in ISO-8601, it doesn’t matter what timezone the server is in, as long as the comparison uses a fully qualified timestamp. It took a bit of trial and error because the parser is pretty strict, but eventually I got it. However, that raw timestamp isn’t easy to edit by hand, so I used an old trick that I ported from Excel to Google Docs: put the data in a spreadsheet for editing, and use columns to format it into the right JSON. Then you can cut & paste the rows into a text editor, delete all the tabs, and get the format you need. Here’s the doc I actually used.
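To make the timezone handling concrete, here’s a minimal sketch of the parse-and-compare step (not lifted from the live script; the timestamp is just an example):

var when = new Date("2011-12-05T10:00-08:00"); // offset is explicit, so the server's timezone doesn't matter
var now = new Date();
if (when < now) {
  console.log("past due, tweet it now");
} else {
  console.log("sleep for another " + (when.getTime() - now.getTime()) + " ms");
}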

The spreadsheet trick worked like magic. I got to use spreadsheet math to track the number of characters remaining in each tweet and to schedule the dates. Things like

=B9+25/24/60

set a time that’s 25 minutes after cell B9 (spreadsheet times are fractions of a day, so 25/24/60 of a day is 25 minutes), which made scheduling our tweets a breeze. With a bit of wrangling, I was able to get the easy-to-edit date and tweet on the left translated into the proper JSON & ISO-8601 format on the right.

After deleting the tabs, here’s what a resulting line looks like:

{"status":"Good Morning, Santa Barbara!  Happy Repeal Day, everybody! 78 Years ago, we lifted the chains of Prohibition, ratifying the 21st Amendment!","timestamp":"2011-12-05 T 10:00 -08:00"},

Add a bracket or two and clip the extra comma, and you’ve got a nice JSON array suitable for our javascript code.
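One detail worth calling out: the daemon ends up reading tweets.js with fs.readFile and JSON.parse (the Data Store snippets above) rather than with require(), so by that point the file is a bare JSON array with no module.exports wrapper. That’s also the shape JSON.stringify writes back, with a tweeted flag added as each entry goes out. Trimmed to a couple of entries (the second one is made up), it looks roughly like this:

[
    {
        "status": "Good Morning, Santa Barbara!  Happy Repeal Day, everybody! 78 Years ago, we lifted the chains of Prohibition, ratifying the 21st Amendment!",
        "timestamp": "2011-12-05T10:00-08:00"
    },
    {
        "status": "Repeal Day fun fact goes here.",
        "timestamp": "2011-12-05T10:25-08:00",
        "tweeted": true
    }
]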

Google Docs was especially nice because it made collaborating on the tweets super easy. My business partner and I had a great way to review and revise the set of tweets as we got ready for the main event.

Timing

The next trick was figuring out how to run the code so that it hummed politely along and sent the tweets when necessary. Since this was the only important process on my machine, I could’ve run a never-ending loop constantly checking the time, but that seemed inelegant, and after all, the point was to learn to use node.js properly. What I wanted was to start the daemon, forget about it, and let it sleep and wake up just when it needed to send a tweet.

So, every once in a while, the daemon wakes up, builds a list of tweets that need to be tweeted (because the current time is after their timestamp), tweets them, and marks them as tweeted. Also, we keep track of the earliest timestamp in the rest of the tweets, so we can sleep the right amount of time.

Here’s how:

var now = new Date();
var next_time;
var tweet_now = new Array();
for (t in tweets) {
  if (tweets[t].tweeted)
    continue;
  time = new Date(tweets[t].timestamp);
  if (time < now) {
    tweet_now.push(tweets[t].status);
    tweets[t].tweeted = true;
  } else if (!next_time || // either this is the first pass
      time < next_time) { // or this is a sooner timestamp than recorded
    next_time = time;
    console.log("setting next time = " + next_time);
  }
}

And then, just a bit of housekeeping. We tweet, save to file, and reset the timer. The nice thing about saving to file is that if we have to kill the daemon, we’ll know which tweets have already been sent when we load the file at the next start.

if (tweet_now.length) {
  tweet(tweet_now);
}
save_tweets();
if (next_time) {
  delay = next_time.getTime() - now.getTime();
  setTimeout(run_tweets, delay);
  console.log("Delay: " + delay);
} else {
  console.log("Done.");
}

And that’s pretty much it.  It’s quick and dirty, so I just dump the errors to console, which is great for debugging, but it may not be the best strategy. More on that later.

Here’s the complete file that we actually used the night of December 5, Repeal Day.

var fs = require('fs');
var OAuth= require('oauth').OAuth;
var keys = require('./twitterkeys');
var path = __dirname+"/tweets.js";
var tweets;
var auto_tweet = function() {
  console.log("auto_tweet");
  fs.readFile(path,"utf8", function(err,data) {
    if (err) throw err;
    tweets = JSON.parse(data);
    // tweets are only loaded once. If you change the file, restart
    run_tweets();
  });
};
var run_tweets = function() {
  console.log("run_tweets");
  //find all the tweets that happen before "now"
  // saving the soonest timestamp that is before "now"
  //mark them as "tweeted"
  //tweet them
  //save to file
  //reschedule
  var now = new Date();
  var next_time;
  // console.log("first next_time = " + next_time.toUTCString());
  var tweet_now = new Array();
  for(t in tweets) {
    if(tweets[t].tweeted)
      continue;
    time = new Date(tweets[t].timestamp);
    if(time < now ) {
      tweet_now.push(tweets[t].status);
      tweets[t].tweeted = true;
    } else if (!next_time || // either this is the first pass
      time < next_time) { // or this is a sooner timestamp than recorded
      next_time = time;
      console.log("setting next time = "+next_time);
    }
  }
  if(tweet_now.length) {
    tweet(tweet_now);
  }
  save_tweets();
  if(next_time) {
    delay = next_time.getTime()-now.getTime();
    setTimeout(run_tweets,delay);
    console.log("Delay: "+delay);
  } else {
    console.log("Done.");
  }
};
var save_tweets = function() {
  fs.writeFile(path,JSON.stringify(tweets,null,4),function(err) {
    if(err) throw err;
    console.log("It's saved");
  });
};
var tweet = function(tweets) {
  var tweeter = new OAuth(
    "https://api.twitter.com/oauth/request_token",
    "https://api.twitter.com/oauth/access_token",
    keys.consumerKey,
    keys.consumerSecret,
    "1.0",
    null,
    "HMAC-SHA1"
  );
  var body;
  for(t in tweets) {
    console.log("tweeting : "+tweets[t]);
    body = ({'status': tweets[t]});
    tweeter.post("http://api.twitter.com/1/statuses/update.json",
      keys.token, keys.secret, body, "application/json",
      function (error, data, response2) {
        if(error){
          console.log('Error: Something is wrong.\n'+JSON.stringify(error)+'\n');
        } else {
          console.log('Twitter status updated.\n');
        }
      });
  }
};
// Now start it up
auto_tweet();

Success

It worked, mostly. And it shipped on time. That was awesome. I made it as simple as possible. If I could have, I would’ve made it simpler. But that isn’t to say there weren’t problems.

Challenges

1. Too many libraries, too much functionality.

It took a long time to wade through the blog posts and tutorials on how to use node.js, OAuth, and Twitter. It’s great that there are so many approaches, many of them well documented, but I didn’t need all that; I just wanted something short and simple. Maybe you’re looking for that too. If so, I hope this post was easy to find.

2. Operations support

As I was out on the Repeal Day pub crawl, all I had was my Android phone to keep on top of things. Surprisingly, I was able to get an SSH client working, even using the public key authentication.  Unfortunately, the text was TINY.  And I couldn’t type the “|” character, making it impossible to use some of my favorite commands.  Apparently, that’s a well-known problem with my particular phone.  Also, the batteries got sucked dry REAL fast.  I had to resort to keeping the phone off most of the time. Even then, it died well before the tweets ran their course.

3. Twitter won’t send your own tweets to your phone

This was particularly annoying. Since my app was sending my tweets and Twitter wouldn’t echo them to my phone, I had to keep checking, either manually or by asking my partner whether the tweet went out.

Unresolved Issues

1. Process management was non-existent

So, not having dealt with server processes for a few years, I hadn’t fully thought through the fact that closing the session would kill my daemon. Next time I’ll try using Forever.
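For what it’s worth, here’s a rough sketch of what that could look like using forever-monitor’s programmatic API; this is an assumption on my part (I haven’t checked it against the version that was current at the time), and the filename is made up:

var forever = require('forever-monitor');

// Restart the daemon automatically if it crashes.
var child = new (forever.Monitor)('autotweeter.js', {
  max: 10,      // give up after 10 restarts
  silent: false // keep the child's output on the console
});

child.on('exit', function () {
  console.log('autotweeter.js has exited for good');
});

child.start();

The forever command-line tool (forever start autotweeter.js) would do the same job without any extra code.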

2. Server may have been unstable

In hindsight, I think the server crashed on me, and I have no idea why. I should’ve piped the error output into a log file, which, presumably, could be done with a proper process-handling approach. Killing and rerunning it restored the service, but there was something fishy going on that I never got to the bottom of.
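Even without a process manager, the script itself could write errors somewhere durable instead of (or in addition to) the console. A minimal sketch of what I have in mind, as an afterthought rather than what ran that night (the log filename is made up):

var fs = require('fs');

// Open the log in append mode so restarts don't clobber earlier entries.
var log = fs.createWriteStream(__dirname + '/autotweeter.log', { flags: 'a' });

var logError = function(msg) {
  log.write(new Date().toISOString() + '  ' + msg + '\n');
};

// e.g., inside the tweeter.post callback:
//   if (error) { logError(JSON.stringify(error)); }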

3. Wacky characters didn’t paste well from Docs to JSON

Undoubtedly this was a Unicode encoding issue, and it showed up in words with tildes or accents, of which there were quite a few given the exotic ingredients in some of the Repeal Day cocktails. The best way around this would be to find a way to read from Google Docs directly. The second would be to build some sort of interface instead of using Google Docs. Alternatively, I could debug the copy & paste process and see where the problem happens. Maybe it was in my SSH terminal, pasting into emacs, which would suggest that copying & pasting into a real editor locally and sending the file to the server might avoid the problem.

And Done

That’s it. Perhaps it was a bit of overkill. There are plenty of free auto-tweeting services out there. But in addition to my doubts about how well they might work, I also thought this was a small enough use case for learning node.js. In that, it was a huge success.

Let me know if this was useful for you. I’d love to hear from fellow coders if this helped you along in any way.
