ElasticSearch on OpsWorks

09/19/2013, 2:15pm CDT
By Ian Ehlert

A quick guide, with some open-sourced Chef recipes, to get ElasticSearch up and running on AWS OpsWorks.

TL;DR

Follow the steps outlined in our cookbook README to get an ElasticSearch cluster running in your OpsWorks stack.

About

ElasticSearch is a popular search server based on Lucene. It has many uses, but one of the most common is indexing and searching text. Couple it with a great Ruby gem like Tire and it becomes even more powerful and easier to use.
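For example, with Tire you can create an index, store a few documents, and run a full-text search in just a few lines of Ruby. The articles index and the document fields below are purely illustrative:

    require 'tire'

    # Build a small "articles" index with a couple of documents.
    Tire.index 'articles' do
      delete
      create

      store title: 'ElasticSearch on OpsWorks', tags: ['devops', 'search']
      store title: 'Scaling Rails',             tags: ['rails']

      refresh
    end

    # Full-text search across the indexed documents.
    results = Tire.search 'articles' do
      query { string 'opsworks' }
    end

    results.results.each do |document|
      puts document.title
    end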

Since ElasticSearch is so easy to develop against, we wanted a quick way to build and manage a cluster. We've been playing around with OpsWorks for a few months, and it seemed like the perfect tool for the job.

How To

The first step is to pick the OpsWorks stack that you want to add the ElasticSearch cluster to, or just create a new one. There are two requirements for this to work.

  1. Use the Amazon Linux AMI
  2. Use Chef 0.9 (these recipes haven't been tested with Chef 11)

Next you need to create a new layer. Its name doesn't matter, but its short name needs to be elasticsearch in order for the recipes to function correctly.
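If you'd rather script this step than click through the console, a recent version of the aws-sdk gem can create the layer for you. The stack ID and region below are placeholders; the part that matters is the elasticsearch short name:

    require 'aws-sdk-opsworks'

    opsworks = Aws::OpsWorks::Client.new(region: 'us-east-1')

    # A custom layer whose short name must be "elasticsearch" so the
    # recipes can find it in the stack configuration attributes.
    opsworks.create_layer(
      stack_id:  'YOUR-STACK-ID',
      type:      'custom',
      name:      'ElasticSearch',
      shortname: 'elasticsearch'
    )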

You'll need to add the following Custom Chef Recipes to the layer for the appropriate lifecycle events (a sketch of what the Configure run relies on follows the list):

  • Setup: elasticsearch::install elasticsearch::packages
  • Configure: elasticsearch
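The Configure event is the interesting one: OpsWorks re-runs it on every instance in the stack whenever an instance comes online or goes offline, passing the current layer membership in the node attributes. The snippet below is not lifted from our cookbook; it's just a minimal sketch of how a configure-style recipe can rebuild the list of cluster hosts from those attributes (the config path, template name, and owner are illustrative):

    # Collect the private IPs of every instance currently in the
    # elasticsearch layer. OpsWorks exposes layer membership under
    # node[:opsworks][:layers] and re-runs configure recipes whenever
    # instances are added or removed.
    es_instances = node[:opsworks][:layers][:elasticsearch][:instances] || {}

    unicast_hosts = es_instances.map { |_name, instance| instance[:private_ip] }.compact

    # Re-render elasticsearch.yml with the current cluster membership.
    # The path, template, and owner here are illustrative, not the
    # actual values used by our cookbook.
    template '/usr/local/elasticsearch/config/elasticsearch.yml' do
      source   'elasticsearch.yml.erb'
      owner    'elasticsearch'
      group    'elasticsearch'
      mode     '0644'
      variables(unicast_hosts: unicast_hosts)
    end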

You can add our elasticsearch cookbook to your own set of cookbooks by using git submodules.

An EBS volume mounted at /data is needed to store the indexed data. Make it whatever size seems appropriate for the data that you are indexing.

You'll also need to set up a security group that allows inbound TCP traffic on port 9200 from any servers that will be querying your ElasticSearch cluster.
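This can be scripted as well. Here's a hedged example with the aws-sdk gem that opens port 9200 to an application security group rather than the whole internet; both group IDs are placeholders:

    require 'aws-sdk-ec2'

    ec2 = Aws::EC2::Client.new(region: 'us-east-1')

    # Allow the app servers' security group to reach the cluster on 9200.
    ec2.authorize_security_group_ingress(
      group_id: 'sg-11111111',  # placeholder: the ElasticSearch layer's group
      ip_permissions: [{
        ip_protocol: 'tcp',
        from_port:   9200,
        to_port:     9200,
        user_id_group_pairs: [{ group_id: 'sg-22222222' }]  # placeholder: app servers' group
      }]
    )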

I also recommend attaching an Elastic IP to one of your nodes so you don't have to worry about hostnames or IPs changing in the future.
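Once you have a stable address, pointing Tire (or any other client) at the cluster is a one-liner; the IP below is a placeholder for your Elastic IP:

    require 'tire'

    # Point Tire at the Elastic IP attached to one of the cluster nodes.
    Tire.configure do
      url 'http://203.0.113.10:9200'
    end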

All that's left is to boot up some instances for this layer. As you add instances, they will automatically be added to the cluster.
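A quick way to confirm that new instances actually joined is to hit the cluster health API and compare the node count with the number of instances in the layer. A minimal check using only the Ruby standard library (again, the host is a placeholder):

    require 'net/http'
    require 'json'

    # _cluster/health reports the cluster status (green/yellow/red) and the
    # number of nodes that have joined.
    health = JSON.parse(Net::HTTP.get(URI('http://203.0.113.10:9200/_cluster/health')))

    puts "status: #{health['status']}, nodes: #{health['number_of_nodes']}"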

That's it! Pull requests welcome!

Tag(s): Home  DevOps  High Availability