glitched.org/up

glitched.org is hosted on Amazon’s S3 service

So, a little while ago, I moved glitched.org and modernsoundmastering.com from a traditional web host (Dreamhost) to Amazon’s S3 cloud storage service.

Why?  First of all, the pricing.  I haven’t paid a penny yet, but the cost projection for this month’s usage comes to about…$0.07.  Yes, seven cents.  A single month of Dreamhost cost $9.95.  At that rate, a full year on S3 comes to well under a dollar; even a hypothetical host charging just $10 per year would still cost more than ten times as much.  (Note: since I use lots of “cloud” services, my bandwidth and storage usage is minimal.)

Second, I discovered that I didn’t really need a fully dynamic site run on something like WordPress.  Yeah, WP is great, but I don’t use many of the dynamic features, like comments or galleries or whatever.  I also don’t update the blog every day, which is another reason a static site is a better fit for me.

Third, you can’t say enough about security.  With a dynamic site, it’s hard to keep up with all of the software updates, and sometimes your favorite plugin is incompatible with the latest version of WP, which can be vexing.  A static site leaves you with nothing to patch.

Finally, don’t underestimate the speed of a static site.  Some database-driven sites take 15-30 seconds to load, which is not something I like to endure.

How?  It wasn’t too, too difficult, but it took some time and research to figure out.  Read below to find out more.

I started by googling, of course, and came upon a few helpful pages and, subsequently, a method that works for me:
1) http://is.gd/4SzWvg – In February 2011, Amazon enabled the “website” feature of the S3 product.  At that point, I wasn’t sure of the benefits, or whether it would be too difficult, but this page, “Cloud Apps Experts,” really convinced me.  This guy had three sites on Amazon, one of which was doing 5.1GB outbound (more than my usage), and the bill still came to way less than a dollar per month.  The article demonstrated that even if his most popular site had ten times the traffic, the cost would still only be about the price of one month on a standard web host.  I was sold.

The article describes what you have to do to configure S3 and your domain so that they work together; a rough sketch of the S3 side follows below.  The Amazon side is a breeze, but your mileage may vary, depending on your domain host and its configuration software.  Dreamhost’s panel is OK, but not as intuitive as some others.  Still, when I did have a problem, the support personnel helped me out quickly.
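To make the moving parts concrete, here is a minimal sketch of the S3 side using the boto3 library.  The bucket name, region, and file paths are assumptions for illustration; I actually did this through the AWS console, and your account’s ACL or bucket-policy settings may differ.

```python
# Minimal sketch of S3 static-website hosting with boto3 (assumed names/region;
# I did this through the AWS console, not a script).
import boto3

s3 = boto3.client("s3", region_name="us-east-1")
bucket = "glitched.org"  # assumption: bucket named after the host it will serve

s3.create_bucket(Bucket=bucket)

# Enable the "website" feature: serve index.html, fall back to an error page.
s3.put_bucket_website(
    Bucket=bucket,
    WebsiteConfiguration={
        "IndexDocument": {"Suffix": "index.html"},
        "ErrorDocument": {"Key": "404.html"},
    },
)

# Upload one page, publicly readable, with the right content type.
with open("static-out/index.html", "rb") as f:
    s3.put_object(
        Bucket=bucket,
        Key="up/index.html",
        Body=f.read(),
        ContentType="text/html",
        ACL="public-read",
    )

# The DNS side (at Dreamhost, in my case) is then a CNAME from the host name
# (typically www) to the bucket's website endpoint, e.g.
# glitched.org.s3-website-us-east-1.amazonaws.com.
```

(Depending on your account settings, public ACLs may be blocked and a bucket policy needed instead; either way, the idea is that the bucket name, the website endpoint, and the DNS record all have to line up.)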

2) http://www.apachefriends.org/en/xampp.html – XAMPP, a portable, local Apache/MySQL/PHP stack on which I would develop the site.  A local WP installation has several advantages, not the least of which is speed.  It’s also a great testing ground: if I ever messed anything up, my website visitors wouldn’t have to deal with any down-time.

Once you start the MySQL and Apache servers, you can install any web application you want, just as if the servers were located on the internet.  I have three WP installations, one for each of my sites, and I just log in, update or create a new post, then publish it as normal.  Next, I needed a way to get static files out of my dynamic blogs.

3) http://mossiso.com/2009/07/20/convert-wp-to-static-html-part-2.html – This site describes a method for rendering a dynamic WP site to static pages using wget, a command-line program for retrieving files via HTTP, FTP, and other protocols.  The commands are simple to understand and could easily be integrated into a script (a rough sketch follows below), but I saw the approach as a hassle.  Yes, it did work as intended, but some files, such as the RSS feed and the sitemap, could not be retrieved.  I would have to look for an all-inclusive way to get static files.
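If you’re curious, the kind of wget invocation the article describes looks roughly like this; wrapping it in a small script keeps the flags in one place.  The start URL and the output folder are my assumptions, and this is the approach I ultimately set aside.

```python
# Mirror the local WP install into a folder of static files with wget.
# The start URL and output folder are assumptions; adjust for your setup.
import subprocess

subprocess.run(
    [
        "wget",
        "--mirror",            # recurse and keep the directory structure
        "--page-requisites",   # also grab the CSS, JS, and images pages need
        "--convert-links",     # rewrite links so the mirrored copy works
        "--adjust-extension",  # save pages with an .html extension
        "--no-parent",         # don't climb above the start URL
        "--directory-prefix=static-out",
        "http://127.0.0.1/glitched/",
    ],
    check=True,
)
```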

4) http://wordpress.org/extend/plugins/really-static/ – “Really Static,” an essential WP plugin that, well, renders static pages from your dynamic WP installation, just as wget does, but it also lets you transform any dynamic page via an “advanced” settings dialog.  It took a few tries to configure the plugin properly, because the author’s documentation is written in rough English, but once it finally worked, it did what it was supposed to do.

Method.  With all these tools, I was about 90% there, but still not completely happy with the solution.  Although “Really Static” is great, it failed to change all the references to the local site URL (http://127.0.0.1) within the static files, resulting in 404 errors.  For instance, it seems to ignore URLs in widgets, such as the archives widget, as well as anything between <script> tags.

I also discovered that certain important theme files (the CSS) were not copied to the static files’ destination folder.  This wasn’t that big of a deal; I just have to remember to copy those files over whenever I change the styling of the site, which I rarely do.

So, how would I change the references to the local site within the hundreds of .html and .xml files?  Text Crawler!

I was really happy to find this application.  It’s easy to use.  Setting up a batch file to find and replace all references to the old site with the new one (http://glitched.org/up) took about two minutes.  The whole operation is now accomplished in one button press.
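Text Crawler is a GUI application, but for the record, that one-button operation boils down to something like the sketch below; the folder name and the URLs are assumptions based on my setup.

```python
# Rewrite every local URL in the exported .html/.xml files to the public one.
# Paths and URLs are assumptions; Text Crawler does the same job with a GUI.
from pathlib import Path

OLD = "http://127.0.0.1"        # local XAMPP address baked into the export
NEW = "http://glitched.org/up"  # where the static site actually lives
OUT = Path("static-out")        # folder holding the rendered static files

for path in list(OUT.rglob("*.html")) + list(OUT.rglob("*.xml")):
    text = path.read_text(encoding="utf-8", errors="ignore")
    if OLD in text:
        path.write_text(text.replace(OLD, NEW), encoding="utf-8")
```

After the replace, I copy the theme’s CSS over by hand (see above), and the folder is ready to upload to S3.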

Conclusion.  After all the research and configuration, creating a new post and uploading it for display on S3 is pretty streamlined.  It takes about three or four steps, which is two or three more than a standard, dynamic WP installation, but as I said, it works for me.  For someone who updates their blog every couple of days, the process described above would get pretty tedious.  Still, if you’re motivated more by cost, as I was, it might make sense to convert your dynamic site into a static one hosted on Amazon’s S3 service.


category: news, site