Scanned in a few old pictures recently, and found that some of the tools I had written for organizing and syncing them would benefit from (at least approximate) date metadata.
I found a nice universal tool for modifying EXIF data: ExifTool by Phil Harvey.
It even has an installable .dmg download for the Mac.
Here's a nice shortcut for setting all three common date fields in a JPEG image.
# show original dates (there are none!)
$ exiftool -alldates test.jpg

# set the dates
$ exiftool "-alldates=1968:05:16 00:00:00" test.jpg

# show updated dates (there are 3!)
$ exiftool -alldates test.jpg
Date/Time Original : 1968:05:16 00:00:00
Create Date : 1968:05:16 00:00:00
Modify Date : 1968:05:16 00:00:00
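Note that -alldates is a shortcut tag that writes the three tags shown above in one pass. By default ExifTool also keeps a backup copy of the file (test.jpg_original); if you are sure of the change, you can tell it to skip the backup:

# rewrite the dates without keeping the _original backup copy
$ exiftool -overwrite_original "-alldates=1968:05:16 00:00:00" test.jpg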
tl;dr
Linter: JSLint - Warning: JSLint will hurt your feelings.
Integration: Works with multiple configurations, integrated into my editor, and as a pre-commit check.
There is one feature I absolutely count on when developing: Code Formatting.
This is the critical feature on which I choose my editor. It is why I loved NetBeans, which I have since abandoned because it feels too heavy. Now I have moved on to TextMate, but I have a nagging feeling that the platform lock-in may yet bring me to Sublime Text 2 soon. (I have failed at Vim too many times to count...)
Beautifying
So TextMate has a decent built-in code formatter, and even a JavaScript-specific one, based on a 2007 PHP version of Einars Lielmanis's JS Beautifier. That beautifier is still one of the finest for JavaScript, but the implementation shipped with TextMate needs an update.
There are lots of third-party bundles which integrate different flavors of this beautifier library into TextMate.
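If you want to try the current version of the library outside of any editor, it also ships as a command-line tool; here is a sketch, assuming Node.js and npm are installed (script.js is a stand-in name):

# install the current JS Beautifier as a command-line tool
$ npm install -g js-beautify
# reformat a file in place with 4-space indentation
$ js-beautify --indent-size 4 --replace script.js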
Linting
Next is linting, which is a code-quality tool. For our purposes there are two choices: JSLint and JSHint.
The easy part is accepting that any set of enforced quality policies is better than none.
The harder part is refining the configuration choices to maximize quality and safety while preserving enough flexibility and convenience.
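As a sketch of what a per-project policy might look like with JSHint's command-line version (the option names are from its documented set; the file list is a stand-in):

# write a minimal project policy: brace all blocks, require ===,
# flag undeclared variables, and assume browser globals
$ cat > .jshintrc <<'EOF'
{ "curly": true, "eqeqeq": true, "undef": true, "browser": true }
EOF
$ jshint --config .jshintrc src/*.js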
Integration
The crux of the choices to be made here depends on the workflow requirements.
Ideal Conditions:
No restriction on platform or editor.
Adapts to different sets of style/coding practices (different projects).
Formatting is integrated into the editor(s), at least on each platform.
Linting is performed as you type, and can be checked en masse at checkin (git pre-commit hook; a sketch follows below).
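Here is a minimal sketch of such a hook, assuming the jshint command-line tool is on the PATH (the hook lives at .git/hooks/pre-commit and must be executable):

#!/bin/sh
# refuse the commit if any staged .js file fails the lint check
for f in $(git diff --cached --name-only --diff-filter=ACM | grep '\.js$'); do
    jshint "$f" || exit 1
done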
For my current editor, TextMate, I use JavaScript Tools by Thomas Aylott, aka subtleGradient, which I installed with GetBundles.
There is also JSTools by Adam Hope, but it does not seem to be as current.
I also considered JSLintMate for jslint/jshint, but wanted a more integrated solution (that I didn't write myself).
While having a discussion recently about the tradeoffs of different representations of our JSON data for an open data project, I thought it important to put this aspect of optimization into context.
Minification is the process by which a JavaScript (or JSON) text representation is minimized by removing all unnecessary characters, like whitespace, from the source. Sometimes it also involves safely rewriting variable names to shorter ones, and so on.
The primary objective of minification is to reduce the time required to transport the JavaScript, by reducing its size.
But there is another factor which affects the transported script's size, usually to an even greater degree: compression.
When an HTTP response is transported it may be compressed. This is subject to a negotiation between the browser and the server but, long story short, almost all browsers support compression, and most (properly configured) servers do too.
For example, Apache uses mod_deflate.
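You can watch this negotiation from the command line; for instance (using example.com and markers.json as stand-in names):

# download size without, then with, compression negotiated
$ curl -s -o /dev/null -w "plain: %{size_download} bytes\n" http://example.com/markers.json
$ curl -s -o /dev/null -w "gzip:  %{size_download} bytes\n" -H "Accept-Encoding: gzip" http://example.com/markers.json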
So while you definitely want to make use of both of these processes, the fact is that compression usually matters more to the final transport size.
You can see these numbers (Size/Content) for yourself in the Firebug or Chrome DevTools Network tab.
Furthermore, these two processes are linked, so a choice made for readability, such as using fewer spaces or shorter variable names, may not have the impact you expect. For example, looking at our geo-marker JSON data file, one might say: hey, if we use tabs instead of our current spacing we would save almost 10 KB (86.67 - 75.28 KB) on this file. But if we take compression into account, we see that we actually only save 0.1 KB.
markers.json             uncompressed  compressed
as currently checked in  86.67 KB      9.57 KB
8 spaces indentation     86.67 KB      9.57 KB
4 spaces indentation     80.16 KB      9.52 KB
2 spaces indentation     76.90 KB      9.49 KB
1 tab indentation        75.28 KB      9.46 KB
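These numbers are easy to reproduce for any candidate formatting; gzip at its highest setting is a reasonable stand-in for what mod_deflate does on the wire:

# raw size, then gzip-compressed size, of the candidate file
$ wc -c < markers.json
$ gzip -9 -c markers.json | wc -c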
In conclusion, I think this is another case where premature optimization is perhaps not the best use of your effort, and readability or maintainability concerns probably outweigh size optimization.
Since OS X doesn't have a native package-management solution, installing some open-source software can be greatly simplified by using MacPorts, which fills this void.
Case in point: although there is a great installer for getting Git up and running on OS X, when I wanted to add git-svn functionality I had to use another installation method, and MacPorts fit the bill perfectly.
The requirements for installing MacPorts do include having a recent version of Xcode, but otherwise it is rather straightforward.
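Once MacPorts is installed, pulling in Git with Subversion support is a one-liner (port and variant names as they were when I did this):

# install git with the subversion (git-svn) variant enabled
$ sudo port install git-core +svn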
Here is a bash completion setup which will auto-complete many aspects of git operation from the shell.
This allows for auto-completion of commands, sub-commands, long-form options, and branch and tag names.
Here are some examples taken from Wincent Colaiuta's blog:
# show subcommands starting with "show"
$ git show<tab>
show show-branch show-ref
# autocompletion of branch names
$ git checkout m<tab>
maint master
# same, but with awareness of "co" alias for checkout
$ git co m<tab>
maint master
# autocompletion of file names
$ git co WO<tab>
WOAudioscrobbler.h WOAudioscrobbler.m WOCommon@ WOPublic@
# autocompletion of options
$ git diff --<tab>
--abbrev --diff-filter --ignore-space-at-eol --no-renames --shortstat
--binary --exit-code --ignore-space-change --numstat --stat
--cached --ext-diff --name-only --patch-with-stat --summary
--check --find-copies-harder --name-status --pickaxe-all --text
--color --full-index --no-color --pickaxe-regex
--color-words --ignore-all-space --no-ext-diff --quiet
If you installed Git with the git-osx-installer, you already have what is needed: there should be a file /usr/local/git/contrib/completion/git-completion.bash, which you can source in your .profile (or .bash_profile). Adding the current branch to your bash prompt is then just a matter of changing your PS1 environment variable:
# excerpt from my ~/.profile
# git bash prompt
PS1='\h:\W$(__git_ps1 "(%s)") \u\$ '
# git-osx-installer Bash shell command completion
if [ -f /usr/local/git/contrib/completion/git-completion.bash ]; then
. /usr/local/git/contrib/completion/git-completion.bash
fi
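With that in place, the prompt shows the current branch whenever you are inside a repository; with the PS1 above it would look something like this (host, directory, and user names are hypothetical):

mymac:myproject(master) user$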
Geoffrey Moore, the insightful author of Crossing the Chasm, discusses a new vocabulary for power in this podcast as part of the Entrepreneurial Thought Leaders Seminar from Stanford. Power in this context is the ability to lead and innovate toward future growth and opportunities.
Geoffrey Moore is that special kind of author who, you feel intuitively, is onto something profound that you must spend some effort to fully grasp; like this pearl of wisdom:
"Power fuels performance, and performance consumes power."
In this talk he offers practical ways to approach the problem of shifting resources to support longer-term commercial development. He frames the discussion in his hierarchy of powers, which describes an organization's position and strength in relation to each of these five levels:
Category power
Company power
Market power
Offer power
Execution power
The part of the discussion which most caught my attention was about Offer power, and how this kind of investment comes down to three types of efforts which operate at the Product Manager level:
Differentiate: distinguishing from competitors
Neutralize: catch up to the competition
Optimize: extract resources to use elsewhere
Moore states that these goals are not mutually compatible, and cannot be fulfilled by a single team or a single mandate. I think I get that!
Moore is a great speaker, so I encourage you to listen to the talk. It covers parts of his upcoming book, Escape Velocity, which is due out in September.
Nothing new under the sun, All Things Old Are New Again; and yet it sure feels like the field of IT is undergoing constant and qualitative change.
There has been much buzz recently around something called DevOps and, much like Cloud Computing before it, it is rather hard to define precisely.
For our purposes, DevOps is not so much a set of specific tools and practices as it is an invitation to broaden the scope of many established practices to include aspects which had hitherto been considered unrelated. DevOps borrows heavily from the toolset of Agile development and extends its use into other fields, such as IT operations.
It should always be kept in mind, however, that these practices are meant to solve business problems, and should not be viewed as dogma. They must therefore be tailored and sequenced in the context of your own organization.
Examples
It would be unthinkable today to implement a development process that doesn't include some kind of version control and testing/QA workflow; but what about server configuration, does it benefit from the same advantages? Tools like Chef and Puppet make these things possible.
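With Puppet, for example, the desired state of a server lives in a versionable manifest file and can be applied repeatably; a minimal sketch (site.pp is a stand-in name):

# apply the configuration described in site.pp to this machine;
# running it again is a no-op if the machine already matches
$ sudo puppet apply site.pp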
Whatever state of maturity of automated testing you operate in, wouldn't it be desirable for your software builds to be tested under conditions that closely resemble actual deployment conditions? Tools like Vagrant and Selenium allow us to describe in code some of our operating assumptions about browsers and servers, and can be integrated into automated build cycles.
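Vagrant, for instance, can bring up a throwaway virtual machine described by a checked-in Vagrantfile and run your checks inside it (command names from its CLI; the test command is a stand-in):

# boot the VM described by the project's Vagrantfile
$ vagrant up
# run the test suite inside the VM (stand-in command)
$ vagrant ssh -c "cd /vagrant && make test"
# throw the VM away when done
$ vagrant destroy -f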
We all have first-hand experience with social media tools such as Facebook and Flickr, and the benefits of their ease of use. GitHub has extended some of these social interaction paradigms into the world of software development, with one-click forking and sharing of entire projects.
The way forward
The widespread availability and low-or-no cost of these tools makes them quite easy to experiment with before you commit to them in a more formal and definitive way. So go ahead, get your feet wet; you'll never look back.
I suggest you look at your current operations; no doubt there are some irritants which could use some loving attention. Then try to see if some of these practices can be creatively and efficiently put to use.
Is there something you can automate?
Is there something you can measure?
Is there a functionality you can expose through an API?
Think: repeatable, measurable, testable.
Also, thinking of your organization's strategic objectives, try to see if some of these tools and practices could give you a competitive advantage, before your competitors do!
Finally got around to re-installing routers at home and in the office.
I had a really nice setup running OpenWrt on a pair of Linksys WRT54Gs for a few years.
But when one of the routers died, I thought it would be a good time to upgrade to newer 802.11n wireless,
and maybe get some attached storage as well.
I ended up getting a Netgear WNR3500L, the (ahem) open-source router.
I also had an extra WRTSL54GS lying around which could go to the office since wireless is infrequently used there and speed is not an issue.
As long as I was re-installing everything, I thought that surveying the latest developments was in order.
Things haven't dramatically changed, although everything seems to have gotten incrementally better and easier.
I managed to bring up a site-to-site VPN setup using only the GUI, although key generation was done on an Ubuntu machine (and the certificates pasted into the GUI).
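For reference, the key generation was roughly the standard easy-rsa 2.x sequence ("server" and "client1" are stand-in names):

# generate the CA, server, and client certificates plus DH parameters
$ cd /etc/openvpn/easy-rsa
$ . ./vars
$ ./clean-all
$ ./build-ca
$ ./build-key-server server
$ ./build-key client1
$ ./build-dh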
The VPN came up right away, but to see the network on the client side I had to follow these instructions.
keys generated on cantor:/etc/openvpn/easy-rsa/[keys|vars]
archived on [cantor|goedel]:/archive/mirror/tomato/[netgear-wnr3500l|wrtsl54gs|openvpn-easy-rsa]