
ActiveRecord is great but it can be a little scary to think about using it without Rails. Here's a quick example of using it with Sinatra.


The first thing we'll want to do is get the ActiveRecord gem by itself (throughout this tutorial I'm assuming you're using Bundler to manage your gems):


gem 'activerecord', :require => 'active_record'
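If you're starting from scratch, the whole Gemfile only needs a few lines. Here's a minimal sketch; the sinatra and sqlite3 entries are my assumptions to match the SQLite setup used below:

```ruby
# Gemfile -- minimal sketch for a Sinatra + ActiveRecord app
source 'https://rubygems.org'

gem 'sinatra'
gem 'sqlite3'
gem 'activerecord', :require => 'active_record'
```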


Next up, in the configure block of our Sinatra app, we'll set up the AR connection to our database:


ActiveRecord::Base.establish_connection(:adapter => 'sqlite3', :database => 'db/development.db')


Of course, if you want to stay flexible, you can move your database config into a YAML file (laid out the same as the Rails standard config/database.yml file) and pull the connection details from that:


db_config = YAML::load(File.read(File.join('config', 'database.yml')))[Sinatra::Application.environment.to_s]
ActiveRecord::Base.establish_connection(db_config)
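For reference, here's what that Rails-style YAML layout looks like and how the environment lookup behaves. This is a sketch with an inline string standing in for the file; the adapter and database values are just examples:

```ruby
require 'yaml'

# Inline stand-in for a Rails-style config/database.yml;
# the adapter and database values are examples.
yaml = <<-YML
development:
  adapter: sqlite3
  database: db/development.db
production:
  adapter: sqlite3
  database: db/production.db
YML

db_config = YAML.load(yaml)

# Note: Sinatra::Application.environment returns a symbol (:development),
# while the YAML keys are strings, so call .to_s before the lookup.
puts db_config[:development.to_s]['database'] # => db/development.db
```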

Now that AR is ready to go we can include our models. I like to move my models into separate files, similar to Rails's app/models, so I tell Sinatra to include every file in that directory:


Dir.glob('./app/models/*').each { |r| require r }


Believe it or not, that's really all you need to do to start using AR in your Sinatra application! The only missing link is database migrations. AR comes with a huge Rake file full of tasks like db:migrate, but unfortunately those rely on being inside the shell of Rails (they look for environment variables like RAILS_ENV and the base Rails object all over the place). Here's a very simple piece of code that will enable the standard db:migrate task:


require 'bundler'
Bundler.require

namespace :db do
  desc "Migrate the database through scripts in db/migrate."
  task :migrate do
    env = ENV['ENV'] || 'development'
    ActiveRecord::Base.establish_connection(YAML.load(File.read(File.join('config', 'database.yml')))[env])
    ActiveRecord::Migrator.migrate('db/migrate')
  end
end


This assumes that your migration files are in db/migrate. When you want to migrate in an environment other than development just set the ENV variable:


rake db:migrate ENV=production
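The fallback is just a nil check on ENV['ENV']; a tiny sketch of the behavior:

```ruby
# Mirrors the ENV['ENV'] fallback used by the db:migrate task:
# use the given environment name if present, otherwise 'development'.
def migration_env(env_var)
  env_var || 'development'
end

p migration_env(nil)          # => "development"
p migration_env('production') # => "production"
```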


That's it! There are a couple of other database ORM libraries out there besides ActiveRecord (like Sequel and DataMapper), but AR is the most popular today, and it's easy to use in all of your web apps, whether they're Rails or not.

10,433 Views 0 Comments Permalink Tags: ruby, sinatra, rails, database, activerecord

Active Local's Features

Posted by JeremyGThomas Feb 18, 2011

I thought I'd share some of the awesome features we've baked into Active Local with you.  While we designed the site to make it easy to browse through activities happening in your area, we've also made it easy for you to contribute activities too.  I put a little screencast together showing how to do this:


Check out Active Local at

1,688 Views 0 Comments Permalink Tags: activelocal, screencast

Amazon AWS is awesome. It makes it stupidly simple to create and deploy complex, load balanced applications with servers in multiple locations around the country (EC2 and ELB), redundant database servers with failover (RDS), unlimited storage guaranteed for at least 10 million years (S3)... the list goes on. But one thing that wasn't so easy, until recently, was security. How do you give your developers and servers access to only the parts of AWS that they needed? That's where Identity and Access Management (IAM) comes in.


With IAM you can create users and policies which define what products (technically, what API calls) a user is allowed to make. With Active Trainer 2.0, for example, all the developers have full access to our AWS account, but the application servers only have access to get and put objects into S3. This way if one of the servers was ever compromised and the attacker got ahold of the access key and secret key (which Amazon requires to make API calls), the worst damage they could do would be to put a bunch of objects into our S3 bucket. Not the end of the world.


Setting up IAM is relatively easy. Amazon doesn't have a GUI for IAM just yet, so you have to use the command line tools. You can get those here:  Once you have them installed you'll need to set up a couple of environment variables so that the tools have access to your existing access key and secret key. Now the fun begins.


First we're going to create a group for our admins. We'll put the access rules on the group itself so that any new users we put into this group automatically inherit those permissions. To create a group:


iam-groupcreate -g admins


Now we'll create a user and put them into our new group:


iam-usercreate -u johndoe -g admins -k -v


This command will return that user's new access key and secret key. Write these down and give them to the new user when they're ready to start using the tools.


So we've got a new user, but by default IAM enforces a "deny all" policy, so they can't do anything yet. Now we'll give the admins group a policy. Policies can be a little confusing at first, so to help you generate them Amazon has put together a neat utility called the AWS Policy Generator. You can learn more about it on the site, but for this blog post I'll give you a couple of policies to get started.


Admins will be allowed to do anything, and the policy that defines that looks like this:

{
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "*",
      "Resource": "*"
    }
  ]
}




Save this in a text file named 'policy.txt' in the same directory where you're making your IAM calls. Here's the command to attach this policy to the admins group:


iam-groupuploadpolicy -g admins -p admins_group_policy -f policy.txt


In this case we gave the policy a name of "admins_group_policy."  You can set this to whatever you want.


So now our 'johndoe' user can make API calls to Amazon. And if, one day, johndoe leaves the company, you simply remove his user from the admins group and all of his access goes away. You can read more about this in the IAM docs.


Here's a policy that we use on our application servers:


{
  "Statement": [
    {
      "Sid": "Stmt1297725053392",
      "Action": [
        "s3:GetObject",
        "s3:PutObject"
      ],
      "Effect": "Allow",
      "Resource": ["arn:aws:s3:::*","arn:aws:s3:::**"]
    }
  ]
}


This policy only allows calls to the given API methods (listed under "Action") and only allows them to be called on the listed buckets and objects (listed under "Resource"). You can get even more fine-grained than this and do stuff like only allowing access to a single API call from a given IP address during certain times of the day. Policies are really powerful.
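Since a policy is just a JSON document, you can also build one from code instead of the Policy Generator. Here's a sketch in Ruby; the bucket name is hypothetical:

```ruby
require 'json'

# Build an IAM policy document as a plain hash, then serialize it.
# The bucket name below is hypothetical.
policy = {
  'Statement' => [
    {
      'Effect'   => 'Allow',
      'Action'   => ['s3:GetObject', 's3:PutObject'],
      'Resource' => ['arn:aws:s3:::my-app-bucket/*']
    }
  ]
}

puts JSON.pretty_generate(policy)
```

Dump the output into policy.txt and attach it with iam-groupuploadpolicy just like the admin policy.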


IAM is awesome. It gives you some great security controls on your AWS services and keeps things safe for users and applications that need to use those services. If you're not using it already, what are you waiting for?

2,210 Views 0 Comments Permalink Tags: security, amazon, aws, iam

Ruby on Rails comes with a bunch of great rake tasks to help you work with your application. One of them is rake stats which counts the lines of code in your controllers, models and tests. It's a pretty neat utility that we use all the time to make the Java developers jealous.


I've been working on a little Sinatra application and wanted the same functionality. The rake stats command uses the CodeStatistics class from one of the core Rails packages, railties (that's "rail-ties," as in the wooden slats that hold up train tracks, not "rail-tees," like t-shirts about trains). Now, the good Ruby developer in me says to simply require 'railties/code_statistics' from within my Sinatra application. However, I also realize that my Sinatra application has nothing to do with Rails and never will. So I did the unthinkable: copy-paste. In this case I don't think it's that bad: code_statistics probably hasn't changed much since Rails 1.0, and it's not something I care about keeping in sync with the master development branch of Rails. Don't sweat it.


I put code_statistics.rb in a new directory in my Sinatra app called vendor which is a Rails convention for code from third-parties. Normally I would just put this code into lib, but in this case rake stats includes the lines of code in any files in lib, and I didn't want to throw off my numbers. In my Rakefile I added a new task (also copied from the Rails default Rakefile with a couple modifications):

desc "Report code statistics"
task :stats do
  require './vendor/code_statistics'

  STATS_DIRECTORIES = [
    %w(Controllers        app/controllers),
    %w(Helpers            app/helpers),
    %w(Models             app/models),
    %w(Libraries          lib/),
    %w(Migrations         db/migrations),
    %w(Views              app/views)
  ].collect { |name, dir| [ name, "./#{dir}" ] }.select { |name, dir| File.directory?(dir) }

  CodeStatistics.new(*STATS_DIRECTORIES).to_s
end


The modifications in this case are the directories that should be scanned for code lines. By default Rails doesn't include your views as counting towards your totals, but I wanted them to. I also have lots of migrations and would like to know how those contribute as well.  If you have any directories in addition to these just add them to the STATS_DIRECTORIES constant. I also wanted to count the number of comments in my code. I'm a big proponent of commenting and I'd like to know what ratio of my code is used to tell my future self how everything works. You can get a copy of my code_statistics.rb file (with comment-counting mods) here:
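At its core, comment counting is just line classification. Here's a minimal sketch of the idea, not the actual CodeStatistics implementation:

```ruby
# Classify each line of Ruby source as blank, comment, or code,
# the same basic idea CodeStatistics uses to build its table.
def count_stats(source)
  stats = { lines: 0, loc: 0, comments: 0 }
  source.each_line do |line|
    stats[:lines] += 1
    stripped = line.strip
    next if stripped.empty?
    if stripped.start_with?('#')
      stats[:comments] += 1
    else
      stats[:loc] += 1
    end
  end
  stats
end

sample = <<-RUBY
# adds two numbers
def add(a, b)
  a + b
end
RUBY

stats = count_stats(sample)
puts stats[:lines]    # => 4
puts stats[:comments] # => 1
puts stats[:loc]      # => 3
```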


And that's it! Now just rake stats from the root of your Sinatra app and you'll see something like this:


+----------------------+-------+-------+----------+---------+---------+-----+-------+
| Name                 | Lines |   LOC | Comments | Classes | Methods | M/C | LOC/M |
+----------------------+-------+-------+----------+---------+---------+-----+-------+
| Controllers          |    65 |    50 |        8 |       0 |       0 |   0 |     0 |
| Helpers              |    41 |    32 |        0 |       0 |       7 |   0 |     2 |
| Models               |    12 |    10 |        0 |       2 |       1 |   0 |     8 |
| Libraries            |     8 |     6 |        1 |       2 |       1 |   0 |     4 |
| Migrations           |    78 |    71 |        4 |       0 |       0 |   0 |     0 |
| Views                |    45 |    39 |        0 |       0 |       0 |   0 |     0 |
+----------------------+-------+-------+----------+---------+---------+-----+-------+
| Total                |   249 |   208 |       13 |       4 |       9 |   2 |    21 |
+----------------------+-------+-------+----------+---------+---------+-----+-------+

Code LOC: 208     Test LOC: 0     Code to Test Ratio: 1:0.0


852 Views 0 Comments Permalink Tags: ruby, sinatra, rails, rake


We had a mandate in Q3 2010 to conceptualize a product that showcased what a regionally-focused Active.com could be. After myriad brainstorming sessions, several user research trips, and some executive coaxing, we landed on Active Local.



Life’s Purpose

Local has one mission in life, and that is to connect active people with the things active people do. Local is tightly integrated with Active.com, which has an awesome listing of activities to register for. All of those great activities have made their way to Local, too.


But we want Local to be about more than just the big events people register for. We're striving to be the definitive guide to activities happening in your neighborhood: activities like morning rides, yoga in the park, afternoon runs and nearby hiking treks. We did our research and found that local bike shops and running shoe stores, the running club down the street, your favorite yoga instructor, or the local climbing store all put on an incredible number of free activities right where you live. So we're working in conjunction with these activity hosts to build up our inventory of things to do in the Bay Area.


In this way Local can be used not only to register  for the Bay to Breakers, for example, but also to find a 6am run to join on  Monday mornings.


We're also promoting local activity hosts - small shops and clubs - by making it easy for you to find your favorite bike shop and the morning rides it puts on, for example. The most active clubs and businesses are featured on our homepage. And you'll also find them listed on the Near You map.


Public Beta Release

Local is targeted towards people in the San Francisco Bay Area. And starting now, users from that region who visit Active.com will see a ribbon encouraging them to visit Local.


Check it out at

1,226 Views 0 Comments Permalink Tags: activelocal

A few months ago Amazon contacted me about our usage of Amazon Simple Notification Service (SNS), a messaging platform that enables topic-based messaging between applications. At the time we were one of the biggest consumers of SNS, and they were curious about what we were doing with it. I pointed them to Realtime and explained our realtime architecture to them. Intrigued, they showed a few of their engineers and later decided to write a whitepaper about Realtime and SNS.


Now, finally, that whitepaper is available. Here's an excerpt:

Active.com was looking for a way to analyze a user’s click-stream in near real-time to deliver pertinent trending information in a timely manner. One of the fundamental ways that Active.com enhances user experience on its website is by understanding and anticipating user needs: surfacing relevant content dynamically to users whenever possible. This is reflected in the “Popular Near You” feature on the homepage, or the “Events Near You” feature on the channel pages.


I've gotta give props to Kevin over at Amazon for driving this whole thing.  And of course, I've also gotta give props to the two guys who built Realtime, Brian Levine and Rob Cameron!

1,292 Views 0 Comments Permalink Tags: cloud_computing, amazon, realtime