I prefer developing on a remote server, since it ensures greater fidelity to the production environment plus awesome portability. I just need a terminal and my .pem key to log on to my servers from anywhere, on any laptop.
Recently, I had to set up a full workflow for automated deployments for the whole tech team, each of whom uses their own remote server. The following is what I did to get the environment up and running.
Developers = 3
Development = remote server (ensures a development environment similar to production, plus it's always accessible)
- Easy remote development
- Easy version control of personal development
- Easy Code review and helping each other
- Easy deployment to testing and rolling back
- Easy deployment to live once tested, and rolling back

Let's look at each of these steps one by one.
EASY REMOTE DEVELOPMENT
- Install Sublime Text 3 (download it from the official site)
- Install Package Control (Sublime's package manager)
If you already have it installed, skip to the next step.
- Install the SFTP plugin:
  - Open Sublime Text
  - Press cmd + shift + p on your Mac
  - Type Install Package in the popup, and press enter when "Install Package" gets highlighted
  - Type SFTP and let the plugin appear in the list. Press enter once it gets highlighted
- Configure the SFTP plugin with your AWS SSH settings
  - In Sublime, make sure the sidebar is displayed and the project folder is visible in it
  - Right-click the folder in the sidebar and navigate to SFTP/FTP > Add Remote mapping
  - Configure the file thus opened with your server details
  - Save the file and try syncing a test file both ways
Tip: Use the keyboard shortcut cmd + control + u + y to sync a local file to the remote. If you then navigate to the file via the browser, you will be able to see the changes.
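For reference, the file the plugin opens for the remote mapping is sftp-config.json. A minimal sketch is below; the host, user, key path and remote path are placeholders, so substitute your own server details (the plugin's generated file documents the full set of options):

```json
{
    "type": "sftp",
    "host": "ec2-12-34-56-78.compute-1.amazonaws.com",
    "user": "ubuntu",
    "ssh_key_file": "~/path/to/your_ec2_pem.pem",
    "remote_path": "/var/www/yourproject/",
    "upload_on_save": true
}
```

With upload_on_save set to true, every save in Sublime pushes the file straight to the server, which is what makes the "edit locally, see it live" loop work.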
Great! You have a remote server whose files you can edit from your local machine, seeing the changes in real time.
If you have a huge codebase, you may want to map the drive to your local system. I have tested this method only on Mac OS X Mavericks; for Linux, I will add instructions later. Mapping an AWS EC2 instance on Mavericks is very straightforward:
- First set up your SSH config by opening ~/.ssh/config. If there is no config file, create one and add the following:

        Host your_ec2_host
            HostName your_ec2_hostname
            User your_ec2_instance_username_like_ubuntu
            IdentityFile ~/path/to/your_ec2_pem.pem
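If you prefer to script this step, the entry can be appended from the shell. Everything below (the alias myec2, the hostname, the key path) is a placeholder for your own values:

```shell
# Sketch: create the SSH config entry from the shell (all values are placeholders).
mkdir -p "$HOME/.ssh" && chmod 700 "$HOME/.ssh"
cat >> "$HOME/.ssh/config" <<'EOF'
Host myec2
    HostName ec2-12-34-56-78.compute-1.amazonaws.com
    User ubuntu
    IdentityFile ~/keys/myec2.pem
EOF
chmod 600 "$HOME/.ssh/config"   # ssh refuses configs/keys with loose permissions
# Now "ssh myec2" (or "ssh -G myec2" to dry-run the resolution) uses these settings.
```

The alias matters for the next steps: Macfusion (and any sshfs-style mount) can refer to the short host name instead of the full hostname, user and key path.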
- Download and install OSXFUSE
While installing, make sure you tick the checkbox for the MacFUSE compatibility layer. Otherwise you will get the dreaded "library not found" error while mounting using Macfusion (we will be installing this in the next step).
- Download and install Macfusion
If you followed step 1, you will be able to configure Macfusion very easily.
If the mount fails, press command + L, try mounting again, and observe the error. I got a library error when I tried the first time. It was due to missing compatibility libraries, because I didn't follow the step 2 instructions properly. I uninstalled OSXFUSE and reinstalled it with the compatibility layer selected. Perfect! No library error this time on mounting. But I got a new error: mount timeout. I fixed this by going to Macfusion's preferences and increasing the mount timeout to 30 seconds. With these two fixes, it mounted perfectly!
- With the folder mounted, you don't have to use SFTP to upload files anymore. Any changes made will be immediately live. This is as close as you can get to having the remote server on your local machine. The downside is that file access can be slow if you don't have a good connection speed. I personally prefer the SFTP route, though the mount route is more portable.
EASY LOCAL VERSION CONTROL, CODE REVIEW AND DEPLOYMENT
NOTE: Local machine = the remote server created above. I am following the SFTP route.
1. Install git on your local machine
2. Since we want code review, we will use bitbucket as a middleman
3. Create account on bitbucket
4. On your local machine, use ssh-keygen to create SSH keys to access your bitbucket repos. I suggest using multiple key identities and an SSH alias to make things easy and error-free. Follow the instructions here: multiple SSH identities
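As a rough sketch of steps 4 and 5, this generates a key pair dedicated to bitbucket and tells SSH to use it for bitbucket.org. The key file name and comment are my own choices, not anything bitbucket mandates:

```shell
# Sketch: a dedicated key pair for bitbucket, plus a host entry in ~/.ssh/config.
mkdir -p "$HOME/.ssh" && chmod 700 "$HOME/.ssh"
# -N "" creates the key without a passphrase; use a passphrase on shared machines.
ssh-keygen -t rsa -b 4096 -N "" -f "$HOME/.ssh/bitbucket_key" -C "bitbucket access"
cat >> "$HOME/.ssh/config" <<'EOF'
Host bitbucket.org
    IdentityFile ~/.ssh/bitbucket_key
    IdentitiesOnly yes
EOF
cat "$HOME/.ssh/bitbucket_key.pub"   # paste this public key into bitbucket's SSH keys page
```

IdentitiesOnly stops SSH from offering every key in your agent, which avoids the "too many authentication failures" problem when you juggle multiple identities.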
5. Upload the public SSH key to bitbucket. Follow instructions here: public key on bitbucket
6. On your local machine, navigate to the project folder (where you have git init'ed) and add the bitbucket repo as origin. Use the command:
$ git remote add origin *ssh url of your git repo*
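Concretely, with a placeholder repository URL (substitute your team's actual SSH URL from bitbucket):

```shell
# Sketch: wire a local repo up to bitbucket (the URL below is a placeholder).
mkdir -p ~/projects/demo && cd ~/projects/demo
git init -q
git remote add origin git@bitbucket.org:yourteam/yourproject.git
git remote -v   # verify: both fetch and push URLs should point at bitbucket
```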
- We will be using git flow, the workflow illustrated here: gitflow
- Now, make changes on your local machine, commit, and push your branches to your bitbucket repo. Once your feature is ready, push your feature branch to the central repo.
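The feature-branch cycle can be demonstrated end to end with throwaway repos. Below, a local bare repo stands in for bitbucket, and feature/login-form is an invented example branch; against your real repo you would skip the setup lines and just branch, commit and push:

```shell
# Self-contained demo of the gitflow feature cycle, using temp dirs as stand-ins.
set -e
tmp=$(mktemp -d)
git init -q --bare "$tmp/bitbucket.git"                  # stand-in for the bitbucket repo
git clone -q "$tmp/bitbucket.git" "$tmp/work" 2>/dev/null
cd "$tmp/work"
git config user.email dev@example.com && git config user.name dev
git commit -q --allow-empty -m "initial commit"
git branch -q develop                                    # gitflow's integration branch
git checkout -q -b feature/login-form develop            # start the feature off develop
echo "<form></form>" > login.html                        # stand-in for real work
git add login.html && git commit -q -m "Add login form"
git push -q origin develop feature/login-form            # feature branch is now on the remote
```

With the feature branch pushed, the pull request in the next step is filed from feature/login-form into develop.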
- File a pull request on bitbucket. The repo admin will check the notification and pull the request into his or her test bitbucket repo.
To make all such bitbucket actions reflect on our servers, we will be using post-update service hooks explained here: Manage bitbucket hooks. We will be using simple POST hooks explained here: POST Hook.
We will be using these service hooks to update our test, staging, live, local, basically any servers. No direct touching of files on the server.
- If all is good, the admin accepts the pull request and merges the feature branch into the develop and master branches on bitbucket.
- This triggers an update in the bare hub repo on the live server. The hub repo, on receiving the updates, uses a post-update hook to update the prime repo, and the site is updated. The concept of hub and prime is borrowed from here: Hub-Prime deployment. Except that, instead of talking to the remote server directly, I have introduced bitbucket's service hooks in between to enable pull requests and code reviews.
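The hub-to-prime hop is just a pull from one repo into the other, which can be demonstrated with throwaway local repos (the temp dirs below stand in for the server paths; on the real server the marked lines would live inside the hub repo's hooks/post-update script):

```shell
# Demo of the hub -> prime update; temp dirs stand in for the live server's paths.
set -e
base=$(mktemp -d)
git init -q --bare "$base/hub.git"                         # the bare hub repo
git clone -q "$base/hub.git" "$base/prime" 2>/dev/null     # the working copy serving the site
git clone -q "$base/hub.git" "$base/dev" 2>/dev/null       # stand-in for the developer's push
git -C "$base/dev" -c user.email=a@b.c -c user.name=dev \
    commit -q --allow-empty -m "deploy me"
git -C "$base/dev" push -q origin HEAD                     # commit arrives in the hub
branch=$(git -C "$base/dev" rev-parse --abbrev-ref HEAD)
# The hub's post-update hook would run exactly these two lines:
unset GIT_DIR                                              # hooks inherit GIT_DIR; clear it first
git -C "$base/prime" pull -q origin "$branch"              # prime now has the pushed commit
```

The unset GIT_DIR line is easy to miss: git hooks run with GIT_DIR pointing at the bare repo, and without clearing it the pull in the prime working copy fails confusingly.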
Since most of my approach is not original and is largely a cocktail of various approaches mixed together, I have intentionally avoided rewriting the fine tutorials written by others and have linked to them directly. In case you face any difficulty, feel free to leave a comment and I will try to help out.
I am approachable on twitter @talvinder as well.