The KISS Blog

Welcome! For the most part, our goal is to share some of the more interesting projects we work on, technical lessons learned, and other technology-related things we find interesting.  For now there is no way to comment on posts, though this may change in the future.  Feel free to Contact Us directly with any questions or suggestions.

Call me old school, but nowadays it seems like everywhere you look, the PHP libraries being made available look like Ruby gems or Python modules.  Sometimes you just need a simple class that does one thing, or a handful of things, and you don't want to have to bring along a bunch of dependencies to get it.  Recently I needed to do some quick AWS CloudFront invalidations from a basic PHP application.  I did not need to manage our entire AWS account, just that one single function.  A quick Google search didn't turn up much other than this library.  At first glance I was in luck, as it looked like just the simple tool I needed, but alas, it required the HTTP_Request2 PEAR module.  I made it work for the moment since it was still a trivial requirement and I needed to get work done, but decided to circle back and modify that class to work with cURL, which is pretty much standard with any PHP install nowadays.

And now we have our new PHP class that can be used either in a standalone PHP application or easily pulled into the CodeIgniter framework as a library: https://github.com/kissit/php-cloudfront-invalidator.
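
If you're curious what the cURL-based approach boils down to, here is a minimal sketch.  To be clear, this is an illustration and not the class from the repository above: the function name and parameters are placeholders, and it assumes the legacy CloudFront XML API, which authenticates by signing only the Date header with HMAC-SHA1.

<?php
// Minimal sketch, not the actual library: create a CloudFront invalidation
// using nothing but cURL, against the legacy CloudFront XML API.
function cloudfront_invalidate($distribution_id, array $paths, $access_key, $secret_key)
{
    $date = gmdate('D, d M Y H:i:s \G\M\T');
    $signature = base64_encode(hash_hmac('sha1', $date, $secret_key, true));

    // Build the invalidation batch XML with a unique caller reference
    $xml = '<InvalidationBatch>';
    foreach ($paths as $path) {
        $xml .= '<Path>' . htmlspecialchars($path) . '</Path>';
    }
    $xml .= '<CallerReference>' . microtime(true) . '</CallerReference>';
    $xml .= '</InvalidationBatch>';

    $ch = curl_init('https://cloudfront.amazonaws.com/2012-07-01/distribution/'
        . $distribution_id . '/invalidation');
    curl_setopt_array($ch, array(
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => $xml,
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_HTTPHEADER     => array(
            'Date: ' . $date,
            'Authorization: AWS ' . $access_key . ':' . $signature,
            'Content-Type: text/xml',
        ),
    ));
    $response = curl_exec($ch);
    $status   = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);

    // CloudFront returns 201 Created when the invalidation is accepted
    return $status == 201 ? $response : false;
}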

Continue reading...

In this post we will cover a method we developed years ago for a specific use case: dynamically mounting external USB hard drives that are encrypted via LUKS.  The need seemed simple enough at first.  A client was looking for an offsite backup solution.  After reviewing some options, it was determined that rather than take on the expense and complexity of a new tape system, we would leverage hard drives in an external USB enclosure to store offsite backups.  These drives would then be rotated offsite on a weekly basis, just like tapes would be.  Oh, and some of the data would need to be encrypted for legal reasons.  After running through a few scenarios I quickly discovered that one of the biggest challenges was going to be reliably, and dynamically, un-mounting and re-mounting the encrypted external drives.  Below I will share the solution I came up with, which has been working wonders ever since, and also share our script with you.
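
As a preview of where the post is headed, a stripped-down sketch of the mount side is below.  Keep in mind this is illustrative only, not our actual script: the key file path, mapper name, and mount point are assumptions.

#!/bin/bash
# Illustrative sketch, not the full script from this post: find the attached
# LUKS-formatted USB drive, unlock it with a key file, and mount it.
DEVICE=$(blkid -t TYPE=crypto_LUKS -o device | head -n 1)
if [ -z "$DEVICE" ]; then
    echo "No LUKS device found, is the drive attached?" >&2
    exit 1
fi

# Unlock the encrypted container under a predictable mapper name
cryptsetup luksOpen --key-file /root/offsite.key "$DEVICE" offsite

# Mount the decrypted device where the backup jobs expect it
mkdir -p /mnt/offsite
mount /dev/mapper/offsite /mnt/offsite

Swapping a drive out for rotation is the reverse: umount the mount point, then cryptsetup luksClose the mapper before pulling the drive.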

Continue reading...

One of the first things a new AWS user may notice when standing up a Linux instance is its less than friendly default host name.  Sure, in a dynamic cloud environment this may or may not be a concern to you.  However, there's always something to be said for having meaningful server names.  While this may seem like a basic concept to some, or a post that is late to the party, we just wanted to show what we consider a nice, simple way to handle this problem using the User Data feature of EC2, without setting each system's host name by hand.
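
As a teaser, a user data script along these lines is all it takes.  The host name below is just a placeholder, and the last line applies to Amazon Linux / RHEL-style systems; adjust for your distribution.

#!/bin/bash
# Hypothetical user data script passed to the instance at launch; it runs on
# first boot and gives the server a meaningful host name.
NEW_HOSTNAME="web01.example.com"
hostname "$NEW_HOSTNAME"
echo "$NEW_HOSTNAME" > /etc/hostname
# Persist the name across reboots on Amazon Linux / RHEL-style systems
sed -i "s/^HOSTNAME=.*/HOSTNAME=$NEW_HOSTNAME/" /etc/sysconfig/network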

Continue reading...

In our last Ansible post I covered managing Ansible inventory, both manually and with the standard EC2 plug-in.  Today I will cover how you can use Ansible to automate the building of AWS EC2 instances to use in your inventory, including tags to use for grouping, as well as creating DNS entries for your new instances in Route 53.

Before going any further, one new concept that will be introduced is the Ansible playbook.  Essentially, a playbook is a set of tasks to perform against the host(s) being targeted in a run of Ansible.  Using the various modules provided, you can build playbooks that do a single simple task, include that task in other playbooks, or define a larger playbook for one complete process.  Once you have your playbook, it is run using the ansible-playbook command, similar to the ansible command used in our previous posts.
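
To make that concrete before diving in, here is a rough sketch of the kind of playbook we will build.  The key name, AMI, region, tags, and DNS names are all placeholders, not values from our actual setup:

---
# Illustrative playbook sketch: launch a tagged EC2 instance and register
# a Route 53 record for it.  All names and IDs below are placeholders.
- hosts: localhost
  connection: local
  gather_facts: no
  tasks:
    - name: Launch an EC2 instance with tags for grouping
      ec2:
        key_name: mykey
        instance_type: t2.micro
        image: ami-123456
        region: us-east-1
        instance_tags:
          Name: web01
          group: webservers
        wait: yes
      register: ec2_result

    - name: Create a DNS entry for the new instance in Route 53
      route53:
        command: create
        zone: example.com
        record: web01.example.com
        type: A
        ttl: 300
        value: "{{ ec2_result.instances[0].public_ip }}"

Saved as, say, provision.yml, a playbook like this would be run with: ansible-playbook provision.yml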

Continue reading...

In this post we will cover a recent implementation for a client with a somewhat unique backup requirement.  For years, this client has been running on a custom solution we put in place leveraging Bacula, standard database dump utilities, and rsync as needed to back up various devices on their network to a dedicated backup server.  The backup server stores all backups locally for a period of time based on type, but also rolls the data to a set of LUKS encrypted offsite HDDs that are attached via a 4-bay USB 3.0 enclosure.  These drives are rotated offsite weekly in a 4 week cycle.  There is nothing fancy about it: Bacula does most of the heavy lifting, with some custom scripts mixed in for databases and/or devices that are more easily handled outside of Bacula.  However, due to recent changes, they now needed to store multiple large batches of data (~20 TB each) per week, indefinitely, both on site and off site.  It became clear very early on that our existing toolset wasn't going to be an efficient option.

Continue reading...