One Hundred Hellos - Blog

Editing Markdown Static Sites - 19 February 2014 - website

Still searching for that frictionless workflow


This post was edited with prose.io.

Set Exif Dates on Scanned Image - 23 September 2012 - images

While scanning in some old pictures, I found that setting their date metadata would be useful

Scanned in a few old pictures recently, and found that some of the tools I had written for organizing and syncing them would benefit from (at least approximate) date metadata.

I found a nice universal tool for modifying EXIF data: ExifTool by Phil Harvey. It even has a Mac-installable DMG download.

Here's a nice shortcut for setting all three common date fields in a JPEG image.

# show original dates (there are none!)
$ exiftool -alldates test.jpg

# set the dates
$ exiftool "-alldates=1968:05:16 00:00:00" test.jpg

# show updated dates (there are 3!)
$ exiftool -alldates test.jpg
Date/Time Original              : 1968:05:16 00:00:00
Create Date                     : 1968:05:16 00:00:00
Modify Date                     : 1968:05:16 00:00:00

JavaScript Beauty Linting - 02 June 2012 - javascript

Discussing a workflow for formatting and linting JavaScript

I write a lot of JavaScript, and I want it to look great and stay consistent.

Get to the point

tl;dr

There is one feature I absolutely count on when developing: code formatting. It is the critical feature on which I choose my editor. It is why I loved NetBeans, which I have since abandoned because it feels too heavy. Now I have moved on to TextMate, but I have a nagging feeling that the platform lock-in may yet bring me to Sublime Text 2 soon. (I have failed at Vim too many times to count...)

Beautifying

So TextMate has a decent built-in code formatter, and even a JavaScript-specific one, based on a 2007 PHP version of Einars Lielmanis's JSBeautifier. That beautifier is still one of the finest for JavaScript, but the implementation shipped with TextMate needs an update. There are lots of third-party bundles which integrate different flavors of this beautifier library into TextMate.

Linting

Next is linting, a code quality tool. For our purposes there are two choices: JSLint and JSHint.

The easy part is accepting that any set of enforced quality policies is better than none. The harder part is refining the configuration choices to maximize quality and safety while preserving enough flexibility and convenience.
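As a concrete starting point, JSHint can read its options from a .jshintrc file in your project (it strips the comments before parsing). The specific options below are just an illustration of the kind of policy trade-offs involved, not a recommendation:

```
// .jshintrc - a minimal, illustrative configuration
{
    "curly": true,      // require braces around all blocks
    "eqeqeq": true,     // require === and !== over == and !=
    "undef": true,      // flag use of undeclared variables
    "browser": true     // predefine browser globals like window
}
```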

Integration

The crux of the choices to make here depends on the workflow requirements.

Ideal Conditions:

For my current editor, TextMate, I chose JavaScript Tools by Thomas Aylott (aka subtleGradient), which I installed with GetBundles.

There is also JSTools by Adam Hope, but it does not seem to be as current. I also considered JSLintmate for JSLint/JSHint, but wanted a more integrated solution (that I didn't write myself).

Now for configuration, and integration...

Relative Importance of Minification and Compression - 01 April 2012 - javascript

While having a discussion recently on the tradeoffs of different data representations of our JSON data for an open data project, I thought it important to put this aspect of optimization into context.


Minification is the process by which the text representation of JavaScript (or JSON) is minimized by removing all unnecessary characters, such as white-space, from the source. Sometimes it also involves safely rewriting variable names to shorter ones.

The primary objective of minification is to reduce the time required to transport the JavaScript, by reducing its size.

But there is another factor which affects the transported scripts' size, usually to an even greater degree: compression.

When an HTTP response is transported, it may be compressed. This is subject to negotiation between the browser and the server but, long story short, almost all browsers support compression, and most (properly configured) servers do too. For example, Apache uses mod_deflate.

So while you definitely want to make use of both of these processes, the fact is that compression usually matters more to final transport size. You can see these numbers (Size/Content) for yourself in the Firebug or Chrome DevTools Network tab.

Here is an example using the Google CDN-hosted jQuery script:

                     uncompressed   compressed
jquery-1.7.1.js      242.42 KB      71.20 KB
jquery-1.7.1.min.js  91.67 KB       32.79 KB
markers.json         88.52 KB       9.64 KB

Furthermore, these two processes are linked, so choices made for readability, like using fewer spaces or shorter variable names, may not have the impact you expect. For example, looking at our geo-marker JSON data file, one might say: hey, if we use tabs instead of our current spacing we would save more than 11 KB (86.67 - 75.28 KB) on this file. But if we take compression into account, we see that we actually only save about 0.1 KB.

markers.json               uncompressed   compressed
as currently checked in    86.67 KB       9.57 KB
8 spaces indentation       86.67 KB       9.57 KB
4 spaces indentation       80.16 KB       9.52 KB
2 spaces indentation       76.90 KB       9.49 KB
1 tab indentation          75.28 KB       9.46 KB
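You can reproduce this experiment with standard tools. The sketch below generates a stand-in markers file (since the real markers.json isn't included here), then re-indents it and prints raw versus gzipped sizes; it assumes python3 and gzip are on the PATH:

```shell
# Generate a stand-in data file; the real markers.json is not needed.
python3 -c 'import json; json.dump({"markers": [{"id": i, "lat": 45.5 + i, "lng": -73.5 - i} for i in range(500)]}, open("markers.json", "w"))'

# Re-indent it several ways and compare raw vs gzip sizes.
for indent in 8 4 2; do
    python3 -c "import json, sys; print(json.dumps(json.load(sys.stdin), indent=$indent))" \
        < markers.json > markers-$indent.json
    echo "indent=$indent raw=$(wc -c < markers-$indent.json) gzip=$(gzip -c markers-$indent.json | wc -c)"
done
```

The raw sizes drop noticeably as the indentation shrinks, while the gzipped sizes barely move.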

In conclusion, I think this is another case where premature optimization is not the best use of your effort, and readability or maintainability concerns probably outweigh size optimization.

Managing Node.js Versions - 16 June 2011 - devops

Since node.js moves rapidly, we often need to update it.

As in our last post, we need to manage versions of http://nodejs.org/.

There are different solutions. I chose all three in turn, but I am now using n. Install it, then use it to grab the latest official release:

sudo npm install -g n
n ls             # list available node versions
n --latest       # print the latest version available
n bin <version>  # print the bin path
sudo n latest    # install latest

n manages its installations in /usr/local/n/versions/.
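Once multiple node versions are in play, a deploy script may want to assert a minimum version before running. This helper is my own sketch, not part of n, and relies on GNU sort's -V (version sort) option:

```shell
# succeed if version $1 >= version $2 (relies on GNU sort -V)
version_gte() {
    [ "$(printf '%s\n' "$1" "$2" | sort -V | head -n 1)" = "$2" ]
}

version_gte "0.4.8" "0.4.0" && echo "0.4.8 is recent enough"
version_gte "0.3.1" "0.4.0" || echo "0.3.1 is too old"
```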

Installing MacPorts - 19 May 2011 - devops

Since OS X doesn't have a native package-management solution, installing some open-source software can be greatly simplified by using MacPorts, which fills this void.


Case in point: although there is a great installer for getting Git up and running on OS X, when I wanted to add git-svn functionality I had to use another installation method, and MacPorts fit the bill perfectly.

The requirements for installing MacPorts do include a recent version of Xcode, but otherwise it is rather straightforward.

As an example, here is how I replaced the git-osx-installer version of git with the MacPorts one, gaining svn and bash_completion support:

# remove git-osx-installer version of git first.
sudo rm -rf /usr/local/git /etc/paths.d/git /etc/manpaths.d/git

# install git, git-svn with MacPorts
sudo port install git-core +bash_completion +doc +svn

Wait a while; this pulls in a lot of dependencies, like perl and ncurses:

---> Dependencies to be installed: bash-completion curl curl-ca-bundle perl5 perl5.12 libidn gettext expat libiconv gperf ncurses ncursesw openssl zlib pkgconfig p5-error p5-libwww-perl p5-compress-raw-zlib p5-crypt-ssleay p5-mime-base64 p5-html-parser p5-html-tagset p5-io-compress p5-compress-raw-bzip2 p5-uri p5-svn-simple subversion-perlbindings apr apr-util db46 sqlite3 readline cyrus-sasl2 neon serf subversion p5-term-readkey python27 bzip2 gdbm python_select rsync popt
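One thing to verify: MacPorts lives under /opt/local, and its binaries must precede the system ones on your PATH for the new git to take effect. The MacPorts installer normally appends something like this to your .profile:

```shell
# MacPorts installer addition: put /opt/local first on the PATH
export PATH=/opt/local/bin:/opt/local/sbin:$PATH
```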

I then replaced this section in my .profile:

# excerpt from my ~/.profile
# git bash prompt
PS1='\h:\W$(__git_ps1 "(%s)") \u\$ '
# MacPorts Bash shell command completion
if [ -f /opt/local/etc/bash_completion ]; then
    . /opt/local/etc/bash_completion
fi

Just for fun, solve this longstanding irritant:

sudo port install wget

Last thought: That was pretty easy! Must remember to run sudo port -v selfupdate frequently.

Git bash completion - 19 May 2011 - devops

When using git from the bash shell, this setup allows for auto-completion of many useful aspects of git usage.

Here is a bash completion setup which will auto-complete many aspects of git operation from the shell. It allows for auto-completion of commands, sub-commands, long-form options, and branch and tag names. Here are some examples, taken from Wincent Colaiuta's blog:

# show subcommands starting with "show"
$ git show<tab>
show          show-branch   show-ref

# autocompletion of branch names
$ git checkout m<tab>
maint    master

# same, but with awareness of "co" alias for checkout
$ git co m<tab>
maint    master

# autocompletion of file names
$ git co WO<tab>
WOAudioscrobbler.h  WOAudioscrobbler.m  WOCommon@           WOPublic@

# autocompletion of options
$ git diff --<tab>
--abbrev                --diff-filter           --ignore-space-at-eol   --no-renames            --shortstat 
--binary                --exit-code             --ignore-space-change   --numstat               --stat 
--cached                --ext-diff              --name-only             --patch-with-stat       --summary 
--check                 --find-copies-harder    --name-status           --pickaxe-all           --text 
--color                 --full-index            --no-color              --pickaxe-regex         
--color-words           --ignore-all-space      --no-ext-diff           --quiet

If you installed Git with the git-osx-installer, you already have what is needed: there should be a file /usr/local/git/contrib/completion/git-completion.bash, which you can source in your .profile (or .bash_profile). Adding the current branch to your bash prompt is then just a matter of changing your PS1 environment variable:

# excerpt from my ~/.profile
# git bash prompt
PS1='\h:\W$(__git_ps1 "(%s)") \u\$ '
# git-osx-installer Bash shell command completion
if [ -f /usr/local/git/contrib/completion/git-completion.bash ]; then
    . /usr/local/git/contrib/completion/git-completion.bash
fi
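One caveat: if the completion file is absent, that PS1 will emit an error on every prompt, since __git_ps1 is undefined. A small guard (my own addition, not part of the installer) keeps the prompt usable either way:

```shell
# only use __git_ps1 in the prompt if it was actually defined
if type __git_ps1 > /dev/null 2>&1; then
    PS1='\h:\W$(__git_ps1 "(%s)") \u\$ '
else
    PS1='\h:\W \u\$ '
fi
```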

A New Vocabulary for Power - 09 May 2011 - innovation

Geoffrey Moore, the insightful author of Crossing the Chasm, discusses a new vocabulary for power in a podcast as part of the Entrepreneurial Thought Leaders Seminar from Stanford. Power in this context is the ability to be an industry leader and innovate toward future growth and opportunities.


Geoffrey Moore is that special kind of author who you feel intuitively is on to something profound, which you must spend some effort to fully grasp; like this pearl of wisdom:

    "Power fuels performance, and performance consumes power."

In this talk he offers practical ways to approach the problem of shifting resources to support longer-term commercial development. He frames the discussion in his Hierarchy of Powers, which describes an organization's position and strength at each of these five levels:

The part of the discussion which most caught my attention was the treatment of Offer Power, and how this kind of investment comes down to three types of effort which operate at the product-manager level:

Moore states that these goals are not mutually compatible and cannot be fulfilled by a single team or a single mandate. I think I get that!

Moore is a great speaker, so I encourage you to listen to the talk. It covers parts of his upcoming book, Escape Velocity, which is due out in September.

Cloud and DevOps - 27 April 2011 - devops

For our purposes, DevOps is not so much a set of specific tools and practices as it is an invitation to broaden the scope of many established practices to include aspects which had hitherto been considered unrelated. DevOps borrows heavily from the toolset of Agile development and extends its use into other fields, such as IT operations.

Nothing new under the sun; all things old are new again. And yet it sure feels like the field of IT is undergoing constant, qualitative change.

There has been much buzz recently around something called DevOps, and much like Cloud Computing before it, it is rather hard to define precisely.


It should always be kept in mind, however, that these practices are meant to solve business problems and should not be viewed as dogma. They must therefore be tailored and sequenced in the context of your own organization.

Examples

It would be unthinkable today to implement a development process that doesn't include some kind of version control and a testing/QA workflow. But what about server configuration? Does it benefit from the same tools' advantages? Tools like Chef and Puppet make these things possible.

Whatever state of maturity of automated testing you operate at, wouldn't it be desirable for your software builds to be tested under conditions that closely resemble actual deployment conditions? Tools like Vagrant and Selenium allow us to describe in code some of our operating assumptions about browsers and servers, and can be integrated into automated build cycles.

We all have first-hand experience with social-media tools such as Facebook and Flickr, and the benefits of their ease of use. GitHub has extended some of these social-interaction paradigms into the world of software development, with one-click forking and sharing of entire projects.

The way forward

The widespread availability and low or no cost of these tools make them quite easy to experiment with before you commit to them in a more formal and definitive way. So go ahead, get your feet wet; you'll never look back.

I suggest you look at your current operations; no doubt there are some irritants which could use some loving attention. Then try to see if some of these practices can be creatively and efficiently put to use.

Think: repeatable, measurable, testable.

Also, thinking of your organisation's strategic objectives, try to see if some of these tools and practices could give you a competitive advantage, before your competitors do!

These are a few of my favorite things...

Site-to-Site VPN Setup - 13 April 2011

Finally got around to re-installing routers at home and in the office.


I had a really nice setup running OpenWrt on a pair of Linksys WRT54Gs for a few years. But when one of the routers died, I thought it would be a good time to upgrade to newer 802.11n wireless, and maybe get some attached storage as well.

I ended up getting a Netgear WNR3500L, the (ahem) open-source router. I also had an extra WRTSL54GS lying around, which could go to the office, since wireless is infrequently used there and speed is not an issue.

As long as I was re-installing everything, I thought that surveying the latest developments was in order. Things haven't dramatically changed, although everything seems to have gotten incrementally better and easier.

I ended up choosing the Tomato v1.28 distro; actually, a rebuilt version of it which has both OpenVPN and USB-storage support ready to go.

I managed to bring up a site-to-site VPN setup using only the GUI, although key generation was done on an Ubuntu machine (and the certificates pasted into the GUI).

The VPN came up right away, but to see the network on the client side, I had to follow these instructions.

VPN Server Config / Server1 / Advanced:
  check:  Manage Client-Specific Options
  enable: push to rd-client|192.168.3.0/24
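For reference, that last GUI option corresponds roughly to the raw OpenVPN server directives below. The client name and subnet come from the setup above, but the file paths are illustrative assumptions:

```
# in server.conf: route the office LAN (192.168.3.0/24) via the VPN,
# and enable per-client option files
route 192.168.3.0 255.255.255.0
client-config-dir /etc/openvpn/ccd

# in /etc/openvpn/ccd/rd-client: tell OpenVPN that this subnet
# sits behind this particular client
iroute 192.168.3.0 255.255.255.0
```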