Thursday, December 29, 2011

Selling Yourself Out of Projects

So, a friend tells me about a buddy who "has a software project you can help him with."

Prospects are very subjective creatures. I tend to think that a professional, whether a software craftsperson or an engineer, or something else, has a kind of fiduciary ethic even toward prospects. If a prospect asks about developing a software project, my immediate concerns are
  1. Does the client really need custom development?
  2. Is there an existing service that already meets their needs?
  3. Will the recurrent tangible and intangible costs (of any desired path) present an impractical future burden?
An unqualified prospect is one for whom the answer to the first question is No and/or the answer to the second is Yes.  The third question is one that no salesperson in their right mind would venture to probe, but it represents a core value of the engineering ethos: is the prospect willing and able to accept the costs?

I'm in my right mind, but I'm not as much a salesperson as my friend.  So I informed the prospect of his options and alternatives early, and gave him some context on the costs. That approach pre-qualifies many prospects out of a project, or at least sends them looking for someone else who is more willing to lead clients down a primrose path.

The thing is, my friend would have just jumped into the job, sweeping questions about the likely costs under the rug, playing dumb about the prospect's needs, and getting them to pay for scraps of functionality as soon as possible.  This sort of drive-by software development is all too common.

I'd rather sell myself out of a project than put blinders on and lead prospects down a rabbit hole. That may run counter to the magical thinking that fuels so many Law of Attraction acolytes, but it is the path an ethical person will tend to follow.

Thursday, December 22, 2011

Unconscientious Prospects

I was contacted recently via email by a prospect seeking assistance with a small PHP/MySQL project. Now, disregarding my personal distaste for Rube Goldberg skunkworks cobbled together in PHP, I offered to help anyway. Paraphrasing, the conversation went something like this:

Prospect:
Hi, I was given your name by *********.  I have a web-based tool and would like to make minor modifications to it. I want to remove some functions for a one-time use.  It should not be a very involved or big task, but I need to have it up and running in three weeks.  If you think you might be interested, please let me know.

Me: (replying a few hours later)
I'd be happy to discuss your needs for the Web tool.  If you would like to talk, I have some time this week. Let me know. 

Prospect: (no reply for two days)

Me: (two days later, this time with a return receipt attached)
Hi, I just wanted to follow up to make sure you received my reply. Are you still looking for assistance? 

Prospect:
Hi, got a follow-up from someone else before your reply.  Will need to do some other work and will get in touch with you when I am ready.

My knee-jerk reaction to this was, "hey, someone beat me to it, good for them!" but then a couple of thoughts occurred to me:
  • It is unprofessional to ask someone for help, then ignore their attempts to communicate.

    Everyone gets distracted once in a while, but even if you aren't immediately going to do business with someone, if they extended the courtesy of a reply you owe them the same courtesy in return.

    Obliviously dropping an email exchange is like walking through a door and letting it slam shut in the face of the person behind you. It isn't very pleasant for the slam-ee, and it reflects poorly upon you.
  • Conversely, if you judge suitability based upon how quickly someone responds, you aren't looking for a relationship but a one-night stand.

    Look, as a freelance programmer I'm not seeking to date you or marry you.  But I do have an expectation of a certain amount of reciprocal attentiveness and return on my investment of time.  Treating your search for help as if you are casting chum and netting the first fish that bites shows that you are not very conscientious. Or at least, that when push comes to shove you become neglectful.
I'm less and less interested in giving people second chances, perhaps because I've so often been long-suffering on the first go-round.  What is really beginning to annoy me, though, is the attitude that we're just "resources" or "commodities".  Damn it, Jim, I'm a person, not a commodity.  If you care so little about who you employ, and you can't communicate, and you consider the effort to be so small, why would I want to bother doing business with you?

Friday, December 16, 2011

CSS Grammar Considered Wrong

I get a little nit-picky every time I look at CSS.
Take the way that the media queries spec handles dimensions like width and height. They couldn't just say "width", no, no, that would be just too easy. Instead they say min-width, width, and max-width.

Flea microphotograph

CSS grammar is meant to be declarative, but that's not the reason for using prefixes. The CSS media queries section on media features tries to explain the rationale:

Most media features accept optional ‘min-’ or ‘max-’ prefixes to express "greater or equal to" and "smaller or equal to" constraints. This syntax is used to avoid "<" and ">" characters which may conflict with HTML and XML. Those media features that accept prefixes will most often be used with prefixes, but can also be used alone.

Never mind that CSS selectors already use the greater-than symbol. The grammar has gotten land-locked, but it has never used commonly accepted notation for mathematical expressions. You know, expressions such as:

width: (160,320)   (interval notation)

160 < width < 320   (chained inequality)

(160..320).include? width   (Ruby, using a range)

width BETWEEN 160 AND 320   (SQL)

Whatever. The nit I'm picking is not that the CSS editors have had to work with what they've inherited.

Whether an historical accident or a pragmatic contingency, using prefixes (min-, max-) on property names as grammatical substitutes for logic expressions (>= and <=, respectively) constitutes a multiplication of entities way beyond necessity. Instead of one property (width) within two scopes (logical and device), CSS presents us with:

width, min-width, max-width

device-width, min-device-width, max-device-width

aspect-ratio, min-aspect-ratio, max-aspect-ratio   (CSS pixel width/height)

device-aspect-ratio, min-device-aspect-ratio, max-device-aspect-ratio   (device pixel width/height)

orientation  (width > height ? landscape : portrait)

Them's A DOZEN MORE properties, baby.
That's just for one stinkin' property, so don't be callin' me all pedantic on this.   If y'all leave nits alone they grow and soon enough you find yourself infested with lice. (Not that I have any experience with lice.)

The gratuitous prefixes don't make the resulting media queries any more elegant to express:

@media screen and (min-width: 160px) and (max-width: 320px) {
  stuff { goes: here; }
}
The problem is that CSS uses a fixed set of discretely expressed predicates for continuous and dynamically varying values. What I want it to say instead is something more along the lines of:

@media screen and (160 < width < 320) {
  stuff { goes: here; }
}

@media screen and (width: range(160..320)) {
  stuff { goes: here; }
}

CSS is what it is, and for better or worse this design choice passed through the editing and public commentary process intact.  Other efforts can still learn from its missteps as much as its successes.  Injecting special-case grammatical devices is an easy way to make a language available early, but injecting them often will turn even a greenfield language into yet another Rube Goldberg device. As a device, CSS has more than a little Rube Goldberg in it.

Tuesday, December 13, 2011

Dropped Google Ads

For a long time, I had a little sidebar of Google ads displayed on my blog. No more.

At the time it seemed like the thing to do. Syndicated advertising was all the rage, everyone was supposed to find ways to monetize their sites, blah blah blah...

I threw up the sidebar partly to go through the steps needed, in case I ever had to do it for someone else, and partly out of curiosity to see what it would generate. Well, this blog is, shall we say, along the road less travelled, so the ads never amounted to anything monetarily.

It turns out that Google ads looked really sucky when the mobile template was active. Actually, they look sucky any time, but their relative suckiness increases as the display window shrinks. I don't know why; they were just ugly.

There was another downside too: besides detracting from the visuals, syndicated advertisements contradict one of the messages of the blog: our society has put so much of its human capital into manipulating perceptions that it has eaten away at workplace ethics and cannibalized its equity in the process.

Bye bye, Google ads, I'm not really going to miss you.

Monday, December 12, 2011

Musings on consciousness

Julian Jaynes had a very informal, very anecdotal hypothesis: before humans first acquired consciousness, our existence was dictated by a non-self-aware construct he termed the bicameral mind. Jaynes argued that the development of consciousness is partly language-based, and partly wired into our brains according to the authority structures of the culture. Consciousness thus developed over time, as the challenges of dealing with multiple interacting cultures forced the simple structures to break down and be reconstituted, generation after generation.

I don't know whether Jaynes' ideas were out in left field or in any way mainstream, but they were intriguing. Through his lens it was easy to interpret xenophobia, black-and-white doctrines, separatism, etc. as attempts by a subculture to regain some measure of the simplicity of the less self-reflective mind.

Jaynes' assumption that consciousness was/is an ever-progressing evolutionary eventuality is wrong. Cultures can and do move backwards, towards a less-than-fully-conscious hive-mind mentality. If anything, consciousness is Lamarckian, not Darwinian. Consider the culture of Afghanistan, or, closer to home, religious cults. Fanatical followers of media personalities such as Oprah or Jim Bakker (not to say they are equivalent) often have great difficulty separating their own opinions from those espoused by their glorified leaders. Prejudice is the antithesis of consciousness.

And it isn't just about collectivist authority structures, but personal identity. Experience of the world drives consciousness about that world. Less experience implies a narrow, specialized focus, but also a limit to consciousness. Ignorance is bliss. Experiences by proxy, as through the Web and TV, create a consciousness that does not accurately image or respond to the physical and social world. In that case, ignorance is not bliss, as the conflicting and falsified viewpoints create unnecessary and unhelpful reflection.

Thursday, December 8, 2011

Autocorrective Crowd Sourcing and Autocatalytic Systems Programming

So, a friend sends me this link about how reCAPTCHA is being used, in an ingenious manner, to farm out digitizing work to unsuspecting users.

Setting aside for the moment any ethical considerations of virtual slavery and digitally indentured servitude, it really is a masterful plan.

The crowd sourcing model can even be used to clean up application data, even when the crowd is relatively small.

Another friend related the case of a US federal government department that contacted him to consult on scrubbing business names from a terabyte-scale database. The chief difficulty is that much of the data originated from people typing in whatever they wanted. My friend rightly points out that it is always cleaner to select a coded value from a list than to allow free-form typing. That is not possible when the values vary, as human contrivances (company names) frequently do.

Systems may be designed to allow people to type in new names in a free-form manner. Free-form data input means garbage in, garbage out: there is no way to enumerate the unbounded variation of lexical strings by which a company name is known. Consider the morphisms of the Johnson & Johnson corporate appellation:

Johnson and Johnson
Johnson & Johnson
Johnson & Johnson, Inc.
Johnson and Johnson, Incorporated
J & J,
J& J,
J. & J. ,
J and J

The letter case is not such a big deal. Neither is the variation in whitespace, or even the substitution of the ampersand for the word "and".  The problem is that the set of these names is finite but unbounded, and the set changes over time. There is no easy way of identifying which entries go with the canonical name for a given organization.

Or is there? Applications capture this data in daily interactions with users. Combo boxes can list previous entries and make coded selections while still allowing novel inputs.  So use string pattern matching to look up the possibilities based upon the first few characters typed. But don't look into the old records; look into a map of generated and collected synonyms instead.

Allow the user to start entering the name of the company. If, after the first few letters, you find an exact match on a canonical name and the user accepts it, you're done: just take the code for the canonical name. If there are no perfect matches, generate a set of weighted regexp patterns based upon the user input; any original names that match and are accepted by the user are recorded as a probable match for future lookups.
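To make that concrete, here is a minimal toy version in Python. The normalization rules, class name, and synonym-map structure are all my own illustrative assumptions (a real system would want weighted patterns and persistence, as described above):

```python
import re

class CompanyLookup:
    """Toy sketch: map free-form company names to canonical codes,
    learning new synonyms as users confirm matches."""

    def __init__(self, canonical):
        # canonical: {code: name}, e.g. {"JNJ": "Johnson & Johnson"}
        self.canonical = canonical
        # Synonym map: normalized variant -> code, grown over the
        # lifespan of the system as users confirm matches.
        self.synonyms = {self._normalize(n): c for c, n in canonical.items()}

    @staticmethod
    def _normalize(name):
        # Fold away the cheap variations: case, whitespace, '&' vs "and",
        # punctuation, and common corporate suffixes.
        s = name.lower().replace("&", " and ")
        s = re.sub(r"[.,]", " ", s)
        s = re.sub(r"\b(inc|incorporated|corp|corporation|ltd)\b", "", s)
        return re.sub(r"\s+", " ", s).strip()

    def match(self, typed):
        """Return (code, candidates): an exact normalized hit yields the
        code; otherwise candidate codes sharing a prefix with the input."""
        key = self._normalize(typed)
        if key in self.synonyms:
            return self.synonyms[key], []
        prefix = key[:3]
        cands = sorted({c for k, c in self.synonyms.items() if k.startswith(prefix)})
        return None, cands

    def confirm(self, typed, code):
        # The user accepted a candidate: record this variant for future lookups.
        self.synonyms[self._normalize(typed)] = code
```

With a canonical entry for Johnson & Johnson, variants like "Johnson and Johnson, Inc." normalize to the same key and resolve immediately; "J & J" misses at first, but once one user confirms it, the synonym map remembers it for everyone after.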

That is admittedly rather sketchy, but I'm sure someone has implemented a few schemes along these lines to clean up old data. What would some of the advantages be?

  • Cost is not incurred when the effort has no value.
    • Data for companies that are no longer participating is not touched.
  • Effort is expended whenever it is needed.
    • Data for companies that participate often is improved more rapidly.
  • Refinement of the data set is embedded as a design feature of the system.
    • Not one-shot: Cleaning of the data continues over the lifespan of the system.
    • Tracks the moving target: Data set targets are ambiguous symbols representing shifting human contrivances (businesses).  

Can the strategy work?  Biological systems do something analogous in the form of DNA error detection and repair. They are far from perfect, but organisms maintain their integrity for decades, up to hundreds of years... thousands if you consider some of the oldest organisms like trees and fungi.

The key idea here is accepting that determinism in complex systems is a convenient myth that is sometimes more trouble than it is worth.  That realization opens the way for explicitly probabilistic feedback loops -- just one more way for complex computer systems to fulfill their original design intent to illuminate the processes of life.

Tuesday, December 6, 2011

Attending Triangle JS Meetup

I'm attending the Triangle JavaScript Meetup tonight. They're discussing Toura Mulberry.

the original photo is by mauroguanandi on Flickr
The open source mobile development and deployment tool, not the fruit. @rmurphey from Toura is presenting.

Mulberry is a toolchain and deployment strategy wrapped around PhoneGap. The unofficial description is that it follows a Rails-like approach to the toolchain. As far as deployment is concerned, Mulberry acts as an HTML browser in a private application.

Mulberry attempts to be a full-cycle development stack for applications deployed as an embedded Web browser (using Sinatra?), displaying content from JSON. Most of the responsibility for presentation sits on the mobile device, implemented in JavaScript. In fact, most of the HTML and markdown gets converted into JSON content data, which is then composed into the DOM by JavaScript on the client.

Like Rails, Mulberry is opinionated, and provides generators for scaffolding. It relies upon the exposure of content through a "node" interface to publish data. Several of the artifacts created with the generator are configuration files for things like the media form factors, devices, site map, default content (in markdown), routes, etc. The system is then immediately available to test using "mulberry serve", the built-in server, without a build process. Also like Rails, the system uses special URL formats to convey configuration settings in the running app.

Basically, Mulberry is a single page, off-line application meant for quick deployment.

Mulberry sends most of the content down as a static set of nodes, but the deployment wrapper supports polling for content refreshes. Currently that functionality is limited to content, not templates or behavior, due to App Store guidelines.

Also mentioned

Saturday, December 3, 2011

Thinking matters, even in Condé Nast magazines

I'm on a quest to get healthier despite my programmer ways, and I do a fair bit of researching. To that end, I review medical literature from databases like PubMed/Medline, and look for the reasons supporting or conflicting with pop-culture (including government) recommendations.

In my perusing, I came across a nutrition site about the fructose content of various foods, operated by SELF Magazine.  That's good, because fructose acts like alcohol on the liver, leading to cirrhosis, insulin resistance, and diabetes, which is bad.

SELF is one of those Condé Nast periodicals aimed squarely at the well-off-but-otherwise-insecure young female crowd.  I'm trying to limit the intake of high-fructose foods in my lifestyle, and if the site is informative for hot, wealthy, young women, surely I can use it too.

Except it doesn't matter that we guys share almost all the same genes with gals, or that fructose doesn't seem to respect any of the differences...

...because as presented the site offers utterly bogus information. In fact, the data is often upside-down.

Let's take a look and make some conclusions based on the site's rankings:

  • Molasses has half the fructose content of iceberg lettuce
  • Applesauce is ranked higher (worse) than honey
  • Plums and onions are almost the same
  • Celery is at #179, within the top 200, with 6.4 grams of fructose
  • Cabbage is ranked higher than Raisin Bran cereal

By ranking the foods based on a fixed calorie count, rather than on a realistic volume or weight basis, the Condé Nast list implies utterly terrible lifestyle advice. It can't be that hard to figure out that foods like cabbage contribute a tiny amount of fructose to the diet.   People who rely upon SELF for nutritional information are not at all well-served by such a poorly designed guideline. Clearly, the folks over at Condé Nast aren't very serious about presenting good nutritional information to women.
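The distortion is easy to demonstrate with a few lines of arithmetic. The figures below are rough ballpark numbers I'm assuming for illustration, not SELF's data:

```python
# Illustrative arithmetic only: the numbers are rough assumptions,
# chosen to show how a fixed-calorie basis inverts the ranking.

def fructose_per_200_kcal(kcal_per_100g, fructose_per_100g):
    """Fructose in a fixed 200-calorie portion: the site's ranking basis."""
    return fructose_per_100g * (200.0 / kcal_per_100g)

def fructose_per_serving(fructose_per_100g, serving_g):
    """Fructose in a realistic serving: the basis that reflects actual diets."""
    return fructose_per_100g * (serving_g / 100.0)

# name: (kcal per 100 g, g fructose per 100 g, realistic serving in g)
foods = {
    "celery":   (16, 0.4, 40),    # one medium stalk
    "molasses": (290, 13.0, 20),  # one tablespoon
}

for name, (kcal, fru, serving) in foods.items():
    print(f"{name:8s}  {fructose_per_200_kcal(kcal, fru):5.1f} g per 200 kcal"
          f"  {fructose_per_serving(fru, serving):5.2f} g per serving")
```

On the fixed-calorie basis, celery looks roughly half as bad as molasses; on a per-serving basis it carries about sixteen times less fructose. That is the inversion.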

If I were Sarah Chubb (president of Condé Nast Digital) I'd order the nutrition site taken down immediately and replace the person who had editorial responsibility for it.  Chubb should find people who can reason quantitatively, and think about the content, not just about how it is presented.

Monday, November 28, 2011

Forgetful Filesystems

There is a slew of news and criticism aimed at IBM for its recently awarded patent on a document-aging file system.

I'm going to come to Big Blue's defense here, and say that it makes some sense in certain contexts.

There is no such thing as a non-degrading medium; digital storage simply reduces the error-injection rate to infinitesimal proportions and increases the rate at which refreshes occur by orders of magnitude.  Purposefully degrading the fidelity of the data stored on a system is a way to put the lifespan of the storage on par with the lifespan of pre-digital technologies, and thus to offer a similar value proposition.

Individuals, businesses, and governments generate a lot of junk data. Should this junk data be treated with the same high fidelity as the stuff we really care about?  As long as we get to choose what constitutes the good stuff and the junk, I'm all for having forgetful filesystems.  They are inevitable.

There are a LOT of business and government records in particular that would be well placed in a round file.  I'm told that some of my tax records, for instance, should be kept around seven years.  Here is a way I can put such records, which to me are truly garbage, in a round file and know that eventually they'll be discarded just like paper copies, and the storage space can be reclaimed and sold to someone else.
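Here is a toy sketch of what such a retention policy might look like. The names and structure are my own illustration, not IBM's patented design:

```python
import time

SEVEN_YEARS = 7 * 365 * 24 * 3600  # the tax-record horizon, in seconds

class ForgetfulStore:
    """Toy 'forgetful' store: each record carries an owner-chosen retention
    period; a scrub pass discards anything past its expiry."""

    def __init__(self):
        self._records = {}  # key -> (expires_at, data); None = keep forever

    def put(self, key, data, retain_seconds=None, now=None):
        now = time.time() if now is None else now
        expires = None if retain_seconds is None else now + retain_seconds
        self._records[key] = (expires, data)

    def scrub(self, now=None):
        """Drop expired records, reclaiming their space; return dropped keys."""
        now = time.time() if now is None else now
        expired = [k for k, (exp, _) in self._records.items()
                   if exp is not None and exp <= now]
        for k in expired:
            del self._records[k]
        return expired

    def get(self, key):
        rec = self._records.get(key)
        return None if rec is None else rec[1]
```

Records with no retention period are kept indefinitely; everything else evaporates on schedule, much like the paper it replaced.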

We never preserved such junk data forever when the medium was paper, so why maintain it just because it went digital?

Sunday, November 27, 2011

OSX Wireless Printing with Leftover PC printers

Apple's AirPrint has very little in the way of direct vendor support. HP seemed to be the only printer maker supporting the protocol.  Now Epson and Canon have joined in.

Epson printers suit me just fine. With an external inkwell, the cost of operation is a tiny fraction of the HP ink suckers. I have an older Epson though. What to do?

I dug up an old TRENDnet TE100-P1U wireless print server.  The install manual claims that only PostScript printers are supported under OSX. Not so!

Just install the Gutenprint (formerly Gimp-Print) CUPS drivers. Then add the print server as a printer under System Preferences, in the Print & Scan settings. Set the printer type to your printer's model (mine is an Epson 9400Fax).  Print a test page and... it worked! So now I can print from OSX wirelessly to my old PC printer, without the old PC.

Now, as to AirPrint... OSX Lion can act as a host to your iOS devices via Activator 2.  It works fine sharing the print server's printer, using the CUPS driver.

Monday, November 21, 2011

Textured Typography with CSS3

Let's suppose I have a textual element that serves as a title and I don't want it to be a plain boring flat color. CSS3 has a nifty way of letting an image serve as a mask for an element.

Let's take an image with some texture to it, say, the Lava effect from Gimp:

We can apply this to an element using the mask-image property:

.title {
    /* The image URL was elided here; any texture image with alpha works. */
    -webkit-mask-image: url(lava.png);
    -o-mask-image: url(lava.png);
    -moz-mask-image: url(lava.png);
    mask-image: url(lava.png);
}

This gives us:

Textured Text

Wherever the image mask is fully opaque, the original element content shows; wherever it is fully transparent, the original element content is occluded. In between, bits of the mask show (or hide) content to the degree that they are partially opaque (or transparent).

Saturday, November 19, 2011

College Computer Science: We don't speak with LISP here.

A presenter at a conference was dismayed at how poorly the computer languages he was taught at NYU translated into practical application on the Web. I asked if LISP was among the languages. It wasn't.

The look on some of the attendees' faces seemed to suggest I was being a wise guy. I wasn't trying to be snarky; I expected his response because I'd heard it repeatedly, and because I'd found through personal investigation of engineering curricula that LISP is no longer in the mainstream.


Substitution of terms in expressions is a fundamental concept. It is so fundamental that kids who are allowed to handle money will do it intuitively when making change. It is bizarre that such a fundamental mechanism of calculation in computer science and mathematical logic is elided from modern computer science degree programs.

(The same kids will run into difficulty reframing their experience when given non-monetary challenges in math classes. They get counting and computing when it is informed by their use of language; they lose the cognition when the language is removed.  So add the vocabulary and operators of substitution to their repertoire, early, often, and concretely, and perhaps they will actually start intuiting what computation is all about.)
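The idea is small enough to fit in a toy model. Here is substitution and reduction over expressions-as-nested-lists, sketched in Python rather than Scheme:

```python
# A minimal sketch of term substitution, the idea LISP/Scheme makes central:
# an expression is just nested lists, and evaluation is repeated substitution.

def substitute(expr, env):
    """Replace every symbol in expr with its binding in env, recursively."""
    if isinstance(expr, list):
        return [substitute(e, env) for e in expr]
    return env.get(expr, expr)

def evaluate(expr):
    """Reduce a fully substituted arithmetic expression of the form [op, a, b]."""
    if not isinstance(expr, list):
        return expr
    op, a, b = expr[0], evaluate(expr[1]), evaluate(expr[2])
    return {"+": a + b, "-": a - b, "*": a * b}[op]

# Making change for a dollar: a price substituted into the same template.
change = ["-", 100, "price"]
print(evaluate(substitute(change, {"price": 63})))  # prints 37
```

Evaluation is nothing more than substituting bindings into a template and reducing: exactly the making-change intuition.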

To the best of my recollection, the first standardized transformation language for markup content was ISO/IEC 10179:1996 (DSSSL). It used a subset of Scheme (a LISP dialect) and in turn informed the development of XSL(T). Functional languages (JavaScript is another example) are excellent tools for processing markup. University educators need to get beyond their own reframing difficulty, recognize the obvious substitution of "markup" for "Web," and use Scheme to re-introduce a little more LISP into their discourse.

Thursday, November 17, 2011

Emotional Maturity and Divorce

My spouse has been married to me for over 25 years. The relationship is symmetric: I've been married to her for precisely the same amount of time. It certainly isn't reflexive: we may be married to each other, but I'm not married to myself.  And it isn't transitive either: if I were married to someone else, that wouldn't mean my two wives were married to each other (and anyway, that would be bigamy). Therefore:

Marriage is Not An Equivalence Relation 

Look, I'm not bragging about it.  I'm no more worthy of accolades on this matter than any one of my divorced friends and relatives.  Life has not always been a bed of roses, either; just as in the economy, externalities shift and expectations change, excitement can give way to disappointment, disappointment to grief, and new realities set in.  But in our 25 years of marriage, we've been witness to a great deal of divorce, which has substantively impacted our family and relationships.

No matter how close you imagine yourself to be to someone, you're not that person and they are not you. The unity is plural. Perceptions matter a little, to be sure, but having a good grasp of reality and an emotional balance between empathy and self-interest matters much more. That means taking time to be reflective together, actively exercising empathy and not just entertaining yourselves on a date night and slugging through chores the rest of your time. Or in the case of those struggling with a loss or divorce, taking a year off to normalize your emotional response to relationships.

Wednesday, November 16, 2011

The Pakled Philosophy of Development

A rambling, opinionated, and very poorly justified rant. Please don't shoot me, I'm just venting. 

A few months ago I blogged on my disenchantment with PHP.  It isn't just PHP, but Java, Perl, and ASP/.NET that make me feel similarly estranged.  The thing is, I want to work with languages, platforms, and systems that don't really, truly, deeply suck, on projects that are worth doing.  Sadly, that's a rare combination. 

I'm not sure exactly what it is about server-pages platforms that gives rise to such feelings of discomfort. I've worked with Joomla and Drupal and ASP sites, so perhaps it is in part the layering of immature cultural cruft inherent in their codebases that gives me the stray thoughts of "what a freakin' mess," and "are you kidding me?"  

But as convoluted as some poorly written components can be, it isn't the primary reason these platforms suck. They suck because, for a dollar you put in today, you have to put a dollar plus change more in tomorrow just to keep the stack from completely collapsing in on itself.  In effect, these platforms encourage a kind of codebase Ponzi scheme or a development Hell.

"We look for things."

All development builds upon previously constructed artifacts. Those artifacts must be available, demonstrably functional, and comprehensible to contribute to the next generation.  Our objective should not be merely to "make things go"; it should include incorporating our knowledge into our archetypal forms, a process of language refinement.  With PHP, Java, and so many other languages, we focus our efforts on entraining more and more information into the platforms, like some sort of junk DNA; the languages and libraries grow in complexity much faster than they grow in value. Similarly, artifacts in these languages seem to have a very limited impact in advancing the system to higher levels of functionality.

Web developers using server-page approaches produce piecemeal patchworks of solutions that last from a few months to a few years. Do such systems get increasingly easier to understand and maintain, or do they get increasingly cluttered and ever more risk-laden?  Is there repeatability, refinement, increasing transparency, and data unencumbered by private licensing, or a daily slog of Development Hell?

PHP systems rarely gain the level of maturity needed to approach either self-consistency or completeness and closure over a domain.  The intelligent fall-back position when you're ignorant is Lean: sacrifice completeness and reduce the feature set to what is minimally viable. Focus on what provides value. But if you just want to make it go, Pakled style, then even sufficient resources won't contribute to the process of exposing and refining the domain language. Developing a domain language to incorporate new knowledge simply isn't a value proposition often pursued in that development culture, and it isn't well supported by the language.

In mathematics, there is the concept of the derivative. In the context of a function that plots a curve, the derivative tells you the rate at which the function's output values are changing at a given point: it tells you the slope, as it were.  In order for solutions to remain viable over time, the slope of the curve of revenue-generating features must be greater than the slope of the curve of hidden costs and intangible liabilities.  Otherwise, you're creating a shell hollowed out of real value.  My sense of server-page technologies is that feature incorporation has a relatively flat slope, the cost curve increases more rapidly, and hidden liabilities arise abruptly and without warning.

For most clients of technology, it is a lose-lose proposition: they know they need to spend the money to stay in the game, but the game is rigged with goods frequently subject to erratic and unplanned obsolescence. 

But that raises two questions. Are there any more sustainable alternatives? Would a more sustainable alternative slow the pace of innovation? I don't pretend to know the answers, but after blathering on in this article, I at least understand some of the reasons why languages like PHP rub me the wrong way.

Monday, November 14, 2011

Groan! Sass is great, but incomplete

Seduced by the lure of smaller, cleaner, neater, and more responsive stylesheets, I began moving my home page to Compass and Sass.  I installed Compass and got the basic framework in place quickly.

Then I went away for a while.

When I came back, it was because I had the brilliant idea to clean up the media selectors I had made a mess of. "Variables," I thought, "why, Sass's variable interpolation would make that much easier!"

Sadly, this is not yet the case as of version 3.1.10:

Syntax error: Invalid CSS after "(max-width: ": expected expression (e.g. 1px, bold), was "$phone-height) {"
        on line 33 of /Users/mamiano/workspace/
        from line 1 of /Users/mamiano/workspace/


Well, they are pushing to fix it in version 3.2.  In the meantime, the thought occurs to me that for simple cases a regexp replacement could do the trick, if it could be inserted before Compass processes the file through Sass.
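A hypothetical sketch of that workaround: expand the variables with a regexp before Sass ever sees the file. The function and variable names here are my own invention, not anything Compass provides:

```python
import re

# Hypothetical pre-processing sketch: expand $variables inside @media
# queries before handing the file to Sass, so version 3.1.x never sees them.

def expand_media_vars(scss, values):
    """Replace $name tokens inside @media query prefixes with literal values."""
    def fix_query(match):
        query = match.group(0)
        for name, value in values.items():
            query = query.replace("$" + name, value)
        return query
    # Only touch the query part, between '@media' and the opening brace;
    # the rule bodies pass through to Sass untouched.
    return re.sub(r"@media[^{]+", fix_query, scss)

src = "@media screen and (max-width: $phone-height) { .nav { display: none; } }"
print(expand_media_vars(src, {"phone-height": "320px"}))
# prints: @media screen and (max-width: 320px) { .nav { display: none; } }
```

This only handles simple literal substitution; anything fancier (arithmetic, nested interpolation) would still have to wait for 3.2.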

Sunday, November 13, 2011

Getting fuse4x installed on OSX Lion

I tried out Fuse4x and sshfs for OSX, through Homebrew, this weekend. It should have been a piece of cake, but apparently the recipes aren't very good at detecting when another FUSE package is installed.

As recommended, I did the install:
$ brew install fuse4x sshfs

Homebrew cheerfully informs me that sshfs won't work unless I sudo-copy a file, so I follow its instructions. It seems a bit strange to have to do this manually:
$ sudo cp -rfX /usr/local/Cellar/fuse4x-kext/0.8.13/Library/Extensions/fuse4x.kext /System/Library/Extensions

$ sudo chmod +s /System/Library/Extensions/fuse4x.kext/Support/load_fuse4x

Alas, things just didn't work:
$ sshfs  ~/mnt/
/Library/Filesystems/fusefs.fs/Support/fusefs.kext failed to load - (libkern/kext) link error; check the system/kernel logs for errors or try kextutil(8).
the MacFUSE file system is not available (71)

Well, actually, it looks like the kernel extension never even loaded.  OSX is supposed to attempt to auto-load them when a fuse filesystem is mounted, but it doesn't look like it succeeded. I use kextunload and kextload to get it to load the extensions:
$ kextunload /System/Library/Extensions/fuse4x.kext/

(kernel) Kext org.fuse4x.kext.fuse4x not found for unload request.
Failed to unload org.fuse4x.kext.fuse4x - (libkern/kext) not found.

$ kextload /System/Library/Extensions/fuse4x.kext/
Yet trying sshfs again gives the same error. What gives?

I look for more info on homebrew and fuse4x forums and bug lists. Wading through Web pages -- they all describe the solution to the link error as (to sum up) "use fuse4x".  I *am* using fuse4x. One bug report on homebrew hinted at a possible cause of the problem.

It looks like at some point, I had a MacFuse installed,  and it left an orphaned copy of /usr/local/lib/pkgconfig/fuse.pc:
$ cat /usr/local/lib/pkgconfig/fuse.pc

Name: fuse
Description: File System in User Space (MacFUSE)
Version: 2.7.3
Libs: -L${libdir} -lfuse -pthread  -liconv
Cflags: -I${includedir}/fuse -D__FreeBSD__=10 -D_FILE_OFFSET_BITS=64

So I brew uninstall fuse4x, remove the fuse.pc file, and brew install fuse4x.
The homebrew recipe symlinked to the correct fuse.pc file, and I hoped all was good.

It was not good. Same error.  I check the system kernel error logs:
 kernel[0]: fuse4x: starting (version 0.8.13, Nov 11 2011, 17:54:24)
 kernel[0]: kxld[]: The following symbols are unresolved for this kext:
 kernel[0]: kxld[]: _OSRuntimeFinalizeCPP
 kernel[0]: kxld[]: _OSRuntimeInitializeCPP
 kernel[0]: Can't load kext - link failed.
 kernel[0]: Failed to load executable for kext
 kernel[0]: Kext failed to load (0xdc008016).
 kernel[0]: Failed to load kext (error 0xdc008016).

Egads! Unresolved symbols? Something just isn't right.  Looks like the same 32 bit MacFuse garbage.

At this point, I'm seriously doubting the sanity of the homebrew fuse4x/sshfs recipes. Maybe someone thought they were working, as a coincidence of already having a working fuse4x kernel extension installed.

Well, it turns out that the MacFuse code was not fully uninstalled, and the brew recipe simply accepts the existing files without complaint or comment:
$ ls -ld /Library/Filesystems/fusefs.fs
drwxr-xr-x  4 root  wheel  136 Dec 19  2008 /Library/Filesystems/fusefs.fs

December of 2008 ???!  That's a little dated for a filesystem I installed today (Nov 2011). Even if it were unzipped from a file (which it wasn't), the datestamps of an active project should be more recent than 2008.

My experience with the fuse4x recipe is that it doesn't do squat for detecting previously installed files.... it just accepts them without complaint.

If you've got a previous install of anything remotely resembling MacFuse, clear it out, clean it out, and purge every remnant from your system before monkeying around with fuse4x.
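A quick scan would have saved me hours. Here is a hypothetical Ruby helper for that (nothing like this ships with Homebrew; the paths are the ones that bit me above):

```ruby
# Report leftover files from old MacFUSE/fuse4x installs before reinstalling.
def leftovers(paths)
  paths.select { |path| File.exist?(path) }
end

SUSPECT_PATHS = [
  "/Library/Filesystems/fusefs.fs",
  "/System/Library/Extensions/fuse4x.kext",
  "/usr/local/lib/pkgconfig/fuse.pc",
]

leftovers(SUSPECT_PATHS).each { |path| puts "leftover: #{path}" }
```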

$ /Library/Filesystems/fusefs.fs/Support/

MacFUSE Uninstaller: Sudoing...
MacFUSE Uninstaller: Can not find the for MacFUSE Core package.

Alas, if you upgraded to OSX Lion and MacFuse was previously installed, a LOT of garbage was left around. Kind of makes you wonder, why Apple still has no package management system, but I digress...
$ brew uninstall fuse4x
$ brew uninstall sshfs
$ rm -rf  /usr/local/include/fuse
$ rm /usr/local/lib/libfuse_*
$ sudo rm -rf /Library/Filesystems/fusefs.fs
$ sudo rm -rf /System/Library/Extensions/fuse4x.kext/
Hopefully, I'm not clobbering anything too important that actually worked before now.  I check for anything else fuse related: 
$ find / -name "*fuse*" -print
AHA! That explains where my MacFuse install came from! It was a stowaway on a tool I had used to support a client project!  Well, expandrive can be trashed with CleanApp. I purge the receipts as well:
$ sudo rm /private/var/db/receipts/
$ sudo rm /private/var/db/receipts/
$ sudo rm /private/var/db/receipts/
$ sudo rm /private/var/db/receipts/
OK. Time to start over. 
$ brew install fuse4x
==> Cloning
Updating /Users/myname/Library/Caches/Homebrew/fuse4x--git
==> Checking out tag fuse4x_0_8_13
==> autoreconf --force --install
==> ./configure --disable-debug --disable-static --prefix=/usr/local/Cellar/fuse4x/0.8.13
==> make install
/usr/local/Cellar/fuse4x/0.8.13: 16 files, 728K, built in 35 seconds
$ brew install sshfs
==> Cloning
Updating /Users/myname/Library/Caches/Homebrew/sshfs--git
==> Checking out tag sshfs_2_3_0
==> autoreconf --force --install
==> ./configure --disable-debug --prefix=/usr/local/Cellar/sshfs/2.3.0
==> make install
==> Caveats
Make sure to follow the directions given by `brew info fuse4x-kext` before trying to use a FUSE-based filesystem.
==> Summary
/usr/local/Cellar/sshfs/2.3.0: 6 files, 116K, built in 10 seconds
$ brew info fuse4x-kext
fuse4x-kext 0.8.13
/usr/local/Cellar/fuse4x-kext/0.8.13 (5 files, 304K)

In order for FUSE-based filesystems to work, the fuse4x kernel extension
must be installed by the root user:
  sudo cp -rfX /usr/local/Cellar/fuse4x-kext/0.8.13/Library/Extensions/fuse4x.kext /System/Library/Extensions
  sudo chmod +s /System/Library/Extensions/fuse4x.kext/Support/load_fuse4x

$ sudo cp -rfX /usr/local/Cellar/fuse4x-kext/0.8.13/Library/Extensions/fuse4x.kext /System/Library/Extensions

$ sudo chmod +s /System/Library/Extensions/fuse4x.kext/Support/load_fuse4x

$ sshfs  ~/mnt/

Finally! Everything just worked! Satisfaction!

Wednesday, November 9, 2011

Going to Compass

A new home page for me, myself, and I, is posted at .

After playing with some ideas about a canvas toy and Javascript managed page, I decided to stop screwing around with it. There is still a "play" page, for silly toys, but the rest of the site uses good-old, straightforward HTML5 content and CSS3.

The intent wasn't to do anything too flashy (as if I'm capable of being that flashy), but to at least show off some of the technologies I use or have used, and to put out a general call to action (hire me!)

After putting up a few more pages I realized I'd be wasting a lot of time futzing around with the CSS rules. The problem isn't one of mere file organization, but of the brutally redundant information spread among the style rules themselves, and the interdependencies among the rules. 

A common software technique for managing increasing complexity is to modularize. Yet CSS provides no meaningful modularization mechanisms.  Splitting style rules into separate .css files is a trivial, almost valueless, proposition, because the cost of keeping up with which rule is (or is not, or should be, or should not be) in which file overwhelms any benefit from separating the rules appropriate to each page.  The problem of repetition of semantic-less constants and interaction among rules also still remains.  

Managing style rules as CSS syntax in files, even with an IDE, is a waste of time. Yes, people pay for it, but it is still time wasted. We can be cleverer than that. 

Compass, or more precisely, SASS, provides a meaningful modularization and refactoring syntax layer. SASS is a kind of algebra for CSS, allowing you to identify patterns and factor out common variables. Compass is the tool that makes it feasible to use. 

First step: get Ruby. Since OSX runs my development machine, and I use RVM to manage the scripting engines and gem components for development, I've already got Ruby. But I do need to choose which Ruby and gemset I'll use:

cd ~/workspace/

rvm gemset create agilemarkup
echo "rvm use ruby-1.9.2@agilemarkup" >.rvmrc

Next step: get Compass. I tend not to like it when components for one project pollute another project, and hate the hassle of manually figuring out component dependencies, so I'm going to use Bundler to make the gem dependency explicit:

cat > Gemfile <<EOF
source ''
gem 'compass'
EOF

Running bundle will fetch the Compass gem and any other gem dependencies needed to use Compass' features.

Now, Gemfile and .rvmrc are specifications of the toolchain that supports the publishing process. Those are pretty high value bits, so they should be version controlled. I use Git, so yeah, that's easy:

echo "Gemfile.lock" >> .gitignore
git add Gemfile .rvmrc .gitignore
git commit -m "add Compass to manage stylesheets"

With the Compass gem installed, it is time to set up the basic framework of files in the working tree:

cd .. 
compass create


There is a load of stuff created:

Finally, get Compass to watch the directories and process any SASS files that change:

compass watch & 

That's it! I'm all set for refactoring my stylesheets.

Monday, October 24, 2011

When a UI is too smart for our own good

I'm tearing my hair out at Apple's latest user experience faux pas: gestures on the desktop.

Now, they seem really cool at first. But a few of the gestures really interfere with my experience in incredibly painful ways.

The most painful so far, is the idiotic choice to make a rightward swipe equivalent to the Back button on the browser. This is such a completely brain dead idea that it actually makes me question the competence of Apple's UI team.

Years ago, some other patronizing UI designer decided to make the Backspace key an alternative trigger for the Back button action. The overloaded behavior meant that countless almost-completed forms were at risk of being wiped away by an obvious response to a typo.

Yes, there are programmatic workarounds, but they shouldn't be needed. Backspace-As-Navigation-Button is a potentially destructive action tied to a control that has a completely different function, and that is a recipe for lost work.


Apple not only did not learn the lesson that overloading controls is a really bad thing to do, it copied this  egregious example of pathological UI design and made it worse by tying it to leftward scrolling. 

Fortunately, Apple does give the ability to turn off the gestures. Just go into System Preferences, into the Trackpad settings, and choose More Gestures. Uncheck the Swipe between pages checkbox, and your experience in browser forms will be much less problematic.  

Wednesday, October 5, 2011

Mac Marginalia

I'm fiddling with getting my resume up on my Web site, and getting diverted by sundry tweaks.
Might as well note the tweaks here, as they seem to be handy. 

The resume is kept in a git repository. The repos are all kept under ~/workspace (and sometimes mirrored to GitHub, Dropbox and/or pushed to the Web.) 

  • Dropbox appears in the Finder "Favorites" sidebar, but ~/workspace does not. 
  • So I go looking for it repeatedly. 
  • That's annoying. What to do?
  1. Open a Finder window. 
  2. Navigate to ~/workspace. 
  3. Press Command-T to add ~/workspace to the Favorites list. 

  • git status in a terminal shell window at ~/workspace/resume reveals a .DS_Store file.
  • Mac OSX pollutes folders with .DS_Store files to hold Finder information. Call 'em lazy. 
  • That file doesn't belong in the repo, and if you remove it, it will just come back. That's annoying. What to do?
  1. Go to the terminal shell at ~/workspace/resume 
  2. echo '*.DS_Store' >> ~/.gitignore_global 
  3. git config --global core.excludesfile ~/.gitignore_global


Tuesday, September 27, 2011

"Opinionated" as a Professional Practice

I hear programmers tossing the word "opinionated" around a lot these days.
Usually, it is used to characterize a development library or framework, as in,

Rails is an Opinionated Framework

So why is it that "opinionated" also seems to be used as a euphemism for hard-coding discrete variables and subroutines, avoiding data-driven process, or otherwise representing decisions as fixed structures in the code?

An old rule of thumb in engineering is that the earlier in fabrication a decision is fixed, the deeper and more widespread its impact. Whether the decision saves cost or adds it, making it early causes a cascade of ripples throughout the system. That's the trouble with being too opinionated: you have already made decisions prejudicially.
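A contrived Ruby sketch of that rule of thumb (the tax-rate scenario and all names are invented for illustration):

```ruby
# Early-bound "opinion": the rate is fixed at write time, so changing it
# means hunting down every place the decision leaked into.
def price_with_tax_hardcoded(price)
  price * 1.07
end

# Late-bound: the decision lives in data and can change without code edits.
RATES = { :nc => 0.07, :de => 0.0 }  # hypothetical rates

def price_with_tax(price, region, rates = RATES)
  (price * (1 + rates.fetch(region))).round(2)
end

puts price_with_tax(100, :nc)  # => 107.0
puts price_with_tax(100, :de)  # => 100.0
```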

Researchers of old used to say that you should build one (or two) to throw away. Perhaps the tendency of programmers to be too opinionated is why. Perhaps it is also why so much of today's hottest new Web coding technology appears to be legacy code before it even gets released.

MacOS X Lion XCode 4 madness

Ah, I'm going utterly insane with frustration. After sitting down two hours ago to start doing some coding, I found that my upgraded Lion system with Xcode 4 has a mysteriously broken command terminal:

MacBook-Pro:~ user$ clear
terminals database is inaccessible
Wha ???

Now, if you go looking at forums, a lot of them are going to say something like "check your TERM" or "try removing your dot files temporarily", or some such. That's misdirection, based on someone hacking together a Linux environment. This is OSX, and I haven't been hacking my environment recently.

Seems that either the Lion upgrade or Xcode 4 screwed something up. An Ubuntu posting on Stack Overflow discusses the cause of the problem, a missing terminfo file (/usr/share/terminfo/78/xterm-256color). I go into Time Machine to investigate, and lo and behold, there is no /usr.


Mac OSX apparently hides UNIX files. Where's the UI setting for that? Um, well, the folks at Apple decided that magically hiding files should be managed by a magical switch:

MacBook-Pro:~ user$ defaults write AppleShowAllFiles True
MacBook-Pro:~ user$ killall Finder

(Really, Apple? Why is there no UI? The man page detritus has no indication of what the actual settings are. So much for UI discoverability, but at least some of the info is documented, and thanks to Google and places like Stack Overflow it can be found.)

I go to restore the file, finding it a couple of weeks back. Unfortunately, Time Machine still won't show UNIX directories in its file browser. The original location (/usr/share/terminfo/78) isn't visible as an option. I copy it manually. Stupid.Magical.Finder.

Note to the person on the Xcode 4 team who decided to do the cleaning up: multiply the time wasted by all the developers you tripped up, and your one mistake probably comes out to hundreds of hours, not to mention the systems that now have crippled terminfo files.

Thursday, September 22, 2011

Nokahuna and That Other task manager

The name of that other task manager is Trello. Even though the name is forgettable, it offers some interesting features. Things like auto-updating, and a very strong visual metaphor.

In contrast, Nokahuna has a kind of 1950s style - the pastel lime green made me think of the worn out Formica of some old diner out in the middle of some old, mostly abandoned downtown.

To be fair, Trello isn't visually stunning itself. But it does offer a more chunky version of the todo list. The metaphor is one of pinup boards with cards stuck to them, and highlighter colors used for labeling. It is not an unpleasant departure from the creamy, everything-runs-together-until-your-eyes-bleed shtick that dominates almost every project management system and todo list manager under the sun.

Unfortunately, in its current incarnation Trello doesn't work well on my iPad2. There was a "transport unavailable" error on every page... I doubt that the auto-updating feature works as expected... And whatever they did to code up the browser UI, it has jerked up the touch events to the point that every action requires at least two touchdowns to register one. Some things that were pointed out in the welcome board simply did not work, like drag and drop, or the elusive user manual (um, where is it exactly?)

That brings me to the last feature that seems to be missing: the ability to delete your account. Trello doesn't give you that option. It seems that some users adopted the practice of appending "delete" to their screen names. Maybe that is in the missing manual.

So I turn my attention to Nokahuna. First issue up: the welcome screen offers me (an iPad2 user) a Flash-based screencast. Not exactly the best first impression.

The rest of Nokahuna reaffirms my initial style impressions.... This _is_ an old dive. There are no frills in Nokahuna, and for the most part that is a good thing. It is a task list manager, and that is it. Teams with fixed functional roles for people may find the minimalism a little too simple, but I can see the appeal to Agile folks who collaborate as independents or in a tiny cross-functional team.

For visual metaphors, Nokahuna is much weaker than Trello. It is unremarkable, like a dozen other tiny todo list tools I've seen, hacked, or used in the past 20 years. This isn't a remark about the minimalism, but about the absence of pleasant variations and the tired look of the thing. For all the weird stuff that cropped up on my iPad2, Trello is still more attractive.  A name like Nokahuna raises expectations a little higher, and the visual appeal of a list just doesn't meet that expectation.

Summary: I'm sure both work well enough on a desktop browser. Trello aims at being multi-platform, but doesn't seem to have reached that goal in practical terms. Nokahuna is the more minimal of the two, perhaps too much so; Trello is more feature filled, almost trendy. Minimalism aside, Nokahuna still needs a facelift. 

Wednesday, September 21, 2011

Conundrum, a sample of UTF-8 in Ruby 1.9.2

Posted to Github as this gist.

# encoding: utf-8
# conundrum.rb
alias :λ :lambda
alias :Ω :abort

module Enumerable
  alias :⇔ :collect
  alias :∉ :reject
  alias :∈ :select
  alias :∫ :inject
  alias :∀ :all?
  alias :∃ :any?
end

class Array
  alias :× :each
  alias :⊠ :each_index
  alias :≡ :eql?
  alias :∋ :include?
  alias :∪ :|
  alias :∩ :&
end

%w(a b c).× do |letter| puts letter; end

(%w(a b c).∪ %w(x y z)).× do |letter| puts letter; end

(%w(a b c).∩ %w(c y z)).× do |letter| puts letter; end

a = λ { |s| puts s }
a.call('test')

Ω "It is the end"

I miss my Spaces

I'm sure the NC DOT has its reasons for designing roads that induce drivers to switch lanes constantly or be forced into turn-only lanes. But having a reason is not the same thing as making a good choice or making acceptable trade-offs.

I'm sure Apple has its reasons for removing Spaces from OSX Lion and replacing it with Mission Control. But they went all NC DOT in the process, forcing desktops into a single lane. The result is a system which is more difficult to navigate and less functional than its predecessor.

How can the seemingly simpler Mission Control be harder than the two-dimensional Spaces? Well, Mission Control isn't actually simpler. Like an NC DOT onion-skinned road design, the user is forced to shift their attention from getting to an objective destination, to a forced choice problem of avoiding undesired exits.

Forcing choices on people in an interface is good when the options are potentially equally valued, and you need to determine a preference. Forced choice is a terrible thing when there is no need to determine the preference, or when there is no preference.

Spaces could be used to eliminate the need to search, by using the brain's natural inclination to remember details based on physical location. Mission Control turns the direct access ability of Spaces into a linear scanning process.

I'm sorry, Apple, but Mission Control is a big time FAIL.

Tuesday, September 20, 2011

Interesting Ruby Tidbits

In Ruby,

@@var class variables are shared between the class, its instances, and any extended classes; thus they act like globals within a superclass' hierarchy.
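A quick demonstration of that sharing (class names invented):

```ruby
class Base
  @@var = "base"
  def self.var; @@var; end
  def self.var=(value); @@var = value; end
end

class Child < Base; end

Child.var = "child"  # writes the @@var shared with Base
puts Base.var        # => "child"
```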

do...end blocks bind at a lower precedence than { .. } blocks; thus,
p [1, 2].map { |s| s * 2 }     # block binds to map; prints [2, 4]
p [1, 2].map do |s| s * 2 end  # block binds to p, which ignores it

"string"  is equivalent to %(string) and %Q(string), but the latter two allow nesting of quote characters without special escaping

To get a list of user methods from a subclass, use the cls.methods method and subtract the base class method list from the class method list.
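For example, for instance methods the subtraction looks like this (Widget and spin are invented here; the same trick works with cls.methods for singleton methods):

```ruby
class Widget
  def spin; end
end

# inherited methods cancel out, leaving only what Widget itself defines
puts (Widget.instance_methods - Object.instance_methods).inspect  # => [:spin]
```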

If you use the default argument to Hash.new("default"), the single default object gets shared among all defaulted hash entries, so mutating it through any key changes all of them. Use the block form of default initialization instead: Hash.new { |h,k| h[k] = "default" }
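The last pitfall is easy to demonstrate:

```ruby
h1 = Hash.new("default")
h1[:a] << "!"    # mutates the one shared default object
puts h1[:b]      # => "default!"  -- every defaulted key sees the change

h2 = Hash.new { |h, k| h[k] = "default" }
h2[:a] << "!"    # mutates only the string stored under :a
puts h2[:b]      # => "default"
```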

Friday, September 16, 2011

To Quote or Not Two Quotes, Taht Iz Dah Qvestion

In C, when one needed to insert a special character or ensure a regular character was interpreted as-is, one would use the backslash character (for those of you in Rio Linda, that's the '\', not the forward slash '/'), like so:


In Ruby, supposedly you can do the same thing. The exception is that in Ruby, the action of the backslash is restricted within single-quoted strings. That is,

"this is a quote: \""  (length 18) 
'this is a quote: \"'  (length 19)

Be mindful of your escaping: you could be escaping too much, depending on the type of quotes you choose. This often caused trouble in our RSpec tests, particularly when Capybara "have_content" matchers were involved.
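The behavior is easy to verify; note the single-quoted string keeps the backslash, so it comes out one character longer:

```ruby
puts "this is a quote: \"".length  # => 18  (backslash consumed, quote kept)
puts 'this is a quote: \"'.length  # => 19  (backslash kept literally)
```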

Wednesday, September 14, 2011

Once Bitten, Twice Shy

Got bitten by the Ruby Constants Are Not Constant bug again. Actually, it was in the same part of the code where I'd found it before, only dealing with a portion of the hash a bit more deeply nested.

SOMEHASH = {
  :elem1 => {
    :props => [ :a, :b, :c ]
  },
  :elem2 => { }
}

hash = SOMEHASH.clone
hash[:elem1][:props] = hash[:elem1][:props].shuffle  # also replaces SOMEHASH[:elem1][:props]

I know, you Rubistas are going to say it isn't a bug; that it is just my naivete; that in Ruby, deep structures like nested arrays and hashes contain references to objects that do not change but the objects that are referenced can mutate.

What is happening here is that the anonymous hash (the object pointed to by the key :elem1) is not being copied. Its reference is what is copied. So shuffling and replacing one of its members (:props) necessarily changes SOMEHASH[:elem1].

One approach is to freeze the hash, but then your app will blow up whenever it tries to manipulate the values. In our app, we were creating copies of elements and shuffling them. The problem was that our copies weren't really full copies: they contained references to the objects still in our original "Constant" hash.

And Matz might argue that it is not surprising, once you understand the details well enough. But it is still undesirable. I find many aspects of Ruby to be elegant, but not this one.

A more gruesome, brute-force approach is to deep copy the hash by serializing the data to a stream, and marshaling it back into a new hash object. It is a one-liner, but that one line is doing a lot of work in order to copy a hash.

def deep_copy(hash)
  Marshal.load(Marshal.dump(hash))
end
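Put together, the aliasing and the Marshal round-trip look like this (a self-contained sketch mirroring the hash above):

```ruby
SOMEHASH = {
  :elem1 => { :props => [:a, :b, :c] },
  :elem2 => { }
}

shallow = SOMEHASH.clone
shallow[:elem1][:props] = [:changed]          # mutates SOMEHASH too!
puts SOMEHASH[:elem1][:props].inspect         # => [:changed]

SOMEHASH[:elem1][:props] = [:a, :b, :c]       # put things back

deep = Marshal.load(Marshal.dump(SOMEHASH))   # the deep-copy one-liner
deep[:elem1][:props] = [:changed]
puts SOMEHASH[:elem1][:props].inspect         # => [:a, :b, :c]
```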

Sunday, September 11, 2011

PHP Considered Inelegant

I'm looking at this year's SparkCon (2011) and noticed that the server had vomited on the sidebar. SparkCon's site apparently uses PHP to parse XML, and the parser's complaint leaked, unhandled, straight into the page:

Warning: SimpleXMLElement::__construct() [simplexmlelement.--construct]:
Entity: line 1: parser error :
Start tag expected, '<' not found in /home/sparkcon/www/www/wp-content/plugins/gcal-sidebar/gcal-sidebar.php on line 369

Well, I'm not picking on SparkCon - it looks like a fantastic set of events - but increasingly I'm feeling less and less tolerant of those kinds of fit-and-finish flaws showing up when I'm interacting with a Web site.

Imagine what you would feel like if you were meeting a new business associate to chat at a cafe, and while you were talking they said "hold on...", unzipped, reached inside their undergarments to adjust themselves, scratched around for a while, and then tried to resume with "ok, go ahead".

We are all human, but some people are just more circumspect than others. That's why, intuitively, PHP rubs me the wrong way: inelegance.

Update: I'll give two examples. First, PHP's over-reliance on special characters and strings of special characters as operators in the syntax (from its Bourne shell -> ED/AWK/SED -> Perl heritage); and second, its concomitant reliance on gruesomely ugly idiomatic expressions for expressing trivial relations and operations. Neither of these PHP characteristics adds value to the solutions; both detract considerably from the readability of the code and add excessively to its code length. Since code-length is a correlate of error injection rate, PHP is objectively a worse basis for making an investment in code. 

I'm not sure inelegance is an entirely fair conclusion, and PHP isn't alone. PHP lacks the charming Rube Goldberg contraptions of Ruby metaprogramming, elegant to some but a sham to others. Perl is its ugly older brother, so ugly that it almost goes all the way around the ugly clock with one-liners that appear elegant for how tightly they compress their ugliness. And as bloated as the JVM platform has become, Java was not a particularly elegant language even when it started. JavaScript is like a fractured gem: turn it one way and it looks elegant; turn it another and the flaws ruin the illusion. And then again, the apparent elegance of a language is not always sufficient to offset poor run-time performance.  PHP is just the scab I'm picking at today.

See also this tongue-in-cheek comparison

Friday, September 9, 2011

Passion Considered Harmful

I had a discussion with a colleague recently, and she was telling me about work in the university. As we parted ways, she quipped that for those university jobs, one really "needed to be passionate about education."

I'm sure that she was serious, and in some ways it is true. I'm also sure that over 50% of the people working in those jobs could be said to be less than passionate. Middle-manager dreary, even. I've seen those people at work, and the utter banality of their expressions is sometimes just torturous to watch. So naturally I'm a little confused by the apparent contradictions.

My question is, why?

It is important to be committed to what you are doing. My question is why is it necessary to be so passionate that you never stop to think about whether it is important or good. We could save a lot of really worthless economic activity, not to mention heartache and grief, with a little dispassionate introspection.

Should we encourage teens to join the sex industry, since many teens are passionate about sexual activity? Passion is not the source of commitment.

Nor is passion a necessary outcome. I want my doctor to be keenly interested in his profession, to be dedicated and deeply involved in whatever it is that he specializes in. But I don't want him to have an unusual excitement, enthusiasm, or compelling affinity for his mode of treatments above alternative protocols that are equally valuable.

A doctor's attachment to his specialty should not cloud his judgement about your specific situation. Passion is a fog to judgement, a useful motivator but deadly without restraint.

Besides which, people lie about being passionate. They especially exaggerate their passions when it is perceived to affect their job prospects. Sometimes, a pretense of passion can indeed turn into the real thing. Yet the incessant drumbeat for passion has become such a common refrain that it has corrupted and colored the very message it was meant to filter.

Professional actors are paid to present a pretense of passion. Such passion is hollow at best - a clever deception for our own amusement - but it is all too common to find people who can act a passionate role with excellence but have little value for honesty and integrity.

In marriage, passion without honesty, integrity, and commitment is just a precursor to divorce, or worse. Passionate actors poison their relationships.

Passion shouldn't be an acid test. It shouldn't even be the first thing you look for. Look for healthy relationships instead. When you find them, the passion will grow out naturally.

Thursday, September 8, 2011

Marginalizing Your Peers

I'm reading a blog by S. Iannarino, in an entry titled "No Garbage In, No Garbage Out." Now, Iannarino's material is a cut above some sales and marketing pieces, but it is still more pop psychology than rocket science. Among the pithy aphorisms and exhortations, Iannarino repeats a few recommendations I've heard often. Just as often, they give me pause.

One of his exhortations is "avoid negative people." Now, on the surface, it seems quite reasonable, even almost natural. An uplifting environment and the comfort of intelligent, positive peers is definitely better. But it has always bothered me a little bit that what the pop-positive-psych-preachers propose to get there, is essentially that we abandon those most in need just so we can protect a personal pie-in-the-sky mental state.

Think that's too extreme? Am I being too negative?

What does this recommendation say about how you should treat widows, orphans, the poor, the under- or unemployed, the sick, those coworkers nobody really knows well, or some minority class of the disenfranchised? No, I'm not writing here of abstract groups who see themselves as victims, but of real people you contact who actually have stresses and struggles. Their coping mechanisms don't always compensate.

What this advice says, essentially, is "Let them go to Hell, so I can pretend I'm in Heaven."

Selfish rule making like this is one of the reasons why a Sales and Marketing mentality is so bereft of ethical standards, and why honesty, transparency, and trust are so much harder to come by in the business world. By narrowing their own focus based upon their own personal dogma, they marginalize their neighbors.

They also probably discard the wisdom of more realistic minds. A study I read suggested that depressive tendencies in kids are usually associated with a more accurate self-assessment than that of their more upbeat peers.

Put another way, knowledge isn't always as uplifting as it is made out to be. The fruit of the tree of knowledge may have been a way of opening the eyes to good and evil, but it also had the effect of excluding its consumers from a garden of ignorant bliss. Ignoring people who complain may be wise; but conversely you may also be ignoring their wisdom.

There is a place for assertiveness when dealing with people who have trouble coping. But it is not assertive to avoid people just because they challenge your world view, force you to consider risks, account for costs, or consider consequences. No, that is far closer to passive-aggressive cowardice.

There is no particular place in Hell reserved for those who choose to close their own ears to other's sorrows. As Jesus said in the Book of Luke (10:25-29), empathy is required for salvation. So if you're a believer who follows this sort of guidance and marginalizes people out of a sense of protecting your personal belief system, your belief isn't a free get-out-of-hell pass. Even if you're not a believer, there's no magikal taboo protection to be had in that sort of positive psychology.

Don't abandon. Learn to cope with the negativity. Help others to see past their own limitations, and mentor those who can accept it by teaching them better coping mechanisms.

Tuesday, September 6, 2011

Running cars on zinc

The federal government has been pushing hydrogen research, and after looking over several research papers I'm astounded at how complicated approaches seem to take all the attention. It is as if people are more interested in playing with science than with solving practical problems.

One concept that caught my eye was an Israeli solar project, which proposed a zinc/zinc oxide/water/hydrogen cycle. Even there, extreme heat is used in the process of reducing water to hydrogen. Six hundred degrees may be manageable, but it isn't really necessary.

But is the zinc process practical? Well, a gallon of gas is equivalent to about 1kg of hydrogen. Hydrogen weighs about 1.01g per mole, so it would take about 990 moles of H atoms to get the power of a gallon of gas. One mole of zinc releases one mole of H2 (two moles of hydrogen atoms) from a mole of water, so you need about 495 moles of zinc.

Since zinc weighs in at 65g per mole, you'd need 32178g of zinc to liberate enough hydrogen to equal a gallon of gasoline. So now we are talking about 70 lb of zinc in a powdered or slurry form. That's not counting the weight of the water you'd have to consume. The weight of 495 moles of water is about 8910g or 20 lb.

It would take 10 gallon-equivalents to provide for a typical commuter vehicle. Carrying around roughly 900 lb of zinc and water doesn't seem at all a good trade-off against a 10 gallon gas tank; 10 gallons of gasoline itself weighs in at about 60 pounds.
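The arithmetic above is easy to sanity-check. Here's a quick awk recomputation of the per-gallon-equivalent figures, using the same rounded molar masses as the text:

```shell
# Recompute the zinc/water figures for one gasoline-gallon equivalent
# (about 1 kg of hydrogen), using the rounded molar masses from the text.
awk 'BEGIN {
  moles_H  = 1000 / 1.01      # ~990 moles of H atoms in 1 kg of hydrogen
  moles_Zn = moles_H / 2      # one Zn frees one H2 (two H atoms): ~495 moles
  zinc_g   = moles_Zn * 65    # grams of zinc needed
  water_g  = moles_Zn * 18    # grams of water consumed
  printf "zinc: %.0f g (%.0f lb), water: %.0f g (%.0f lb)\n",
         zinc_g, zinc_g / 453.6, water_g, water_g / 453.6
}'
# ~32178 g (~71 lb) of zinc, ~8911 g (~20 lb) of water
```

That lines up with the roughly 70 lb of zinc and 20 lb of water quoted above.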

We could do the same sorts of computations to find that aluminum or magnesium would cut the weight by somewhere around two-thirds, to around 300 pounds, which is certainly more competitive, although either metal would require more energy to reclaim.

Friday, September 2, 2011

Git's Poor Command Line Habits

Tom Lord's Arch, or 'tla', was one of the first open source distributed version control systems. It was widely criticized for its overly long names and convoluted command line interface. Linus's Git has shorter names, but its command lines can be just as perverse.

Take, for instance, the command to set the repository back in time by one commit:

git reset --hard HEAD^

OK, what's up with that? Was it really necessary to introduce an idiosyncratic tree-walking syntax just for Git?

[edit: not to mention, git reset is as dangerous a command as rm * for pretty much the same reason.]
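One mitigating factor, for what it's worth: unlike rm, a hard reset can usually be undone through the reflog. A quick sketch in a throwaway repository (the identity settings are just for the demo):

```shell
set -e
dir=$(mktemp -d) && cd "$dir"
git init -q
git config user.email you@example.com   # throwaway identity for the demo
git config user.name you
echo one > file; git add file; git commit -qm "first"
echo two > file; git commit -qam "second"
git reset -q --hard HEAD^        # "loses" the second commit...
git reset -q --hard 'HEAD@{1}'   # ...but the reflog still knows where HEAD was
cat file                         # prints: two
```

The reflog only lives in your local repository and entries do expire eventually, so it's a safety net, not an excuse.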

In Git, HEAD^ refers to the parent of the HEAD commit. But wait, there's more!
Use HEAD^^ for the parent of the parent of the HEAD commit, if it exists, or HEAD~3 (that's a tilde, '~', not a hyphen) for the great-grandparent.
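You can convince yourself of the equivalences with git rev-parse in a scratch repository:

```shell
set -e
dir=$(mktemp -d) && cd "$dir"
git init -q
git config user.email you@example.com   # throwaway identity for the demo
git config user.name you
for n in 1 2 3 4; do
  echo "$n" > file; git add file; git commit -qm "commit $n"
done
# HEAD^^ and HEAD~2 are two spellings of the same commit (the grandparent):
[ "$(git rev-parse HEAD^^)" = "$(git rev-parse HEAD~2)" ] && echo "same commit"
git log --format=%s HEAD~3    # prints: commit 1 (the great-grandparent)
```

(The caret and tilde forms diverge only on merge commits, where HEAD^2 means "second parent" rather than "grandparent".)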

There's not a lot of consistency across the various command line interfaces in Git; most of the commands use a mix of positional arguments, labelled parameters, and options, which can make the commands feel arbitrary and thus needlessly difficult to remember. The choice of command names is similarly haphazard.

Git has some good qualities, and lots of documentation. But lots of programmers keep complaining that they can't remember what that particular command was they were looking for, and it is evident that with Git it is difficult for many of us to infer the proper semantics from the syntax of the options. It's not us, it's you, Git! You're hard to remember!

Wednesday, August 31, 2011

Configuring a Git-controlled Web site

This is assuming you're using a *nix setup for the server, and Mac OS X for the workstation.

First, register your public key with the server's SSH. On a shared host, this would typically be under a control panel icon for Shell Access or SFTP/SSH . Consult your hosting service for details.

Suppose your site is "lili.pops" and your user name is "jonathan".
You should be able to log in to your site. Open a command terminal window, and:

ssh jonathan@lili.pops
If you can't do that and get a shell prompt, you're hosed. There are ways around it, but that's another post. Knowing how to navigate this level of complexity is a sort of prerequisite... you don't need to know it all, just how to find out how to do this kind of stuff.

So, you are logged in to your server, and you create a subdirectory for your repositories, and a directory for your web site or app:

mkdir -p ~/repositories/

You initialize that directory as a "bare" git repo. (A bare git repo has no modifiable source files present, just the repository metadata and data files). This will be the repo you will stage changes to immediately prior to deploying:

cd ~/repositories/
git init --bare

Now, switch to another terminal, either another window or a tab. Create and/or change to the working directory for your lili.pops site:

mkdir -p ~/workspace/
cd ~/workspace/

Now, at this point, we're assuming that there are oodles and oodles of cool source files with names like "index.html" and "flavor_picker.js" just sitting around in ~/workspace/

We can (we should, nay, we must!) turn the directory into a git repo (a non-bare repo with source files, that is):

git init
git add .
git commit -m "initial commit of content"

This is the really weird part: Git links up repositories very loosely. Git records each set of changes to your files as a "commit". These commits can be pushed by you to bare repositories, or pulled by people working with you into their own repositories.

So people contribute independently. Note also, that it isn't necessary for everyone to have complete and utter exposure to all of your files.
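To make that loose linking concrete, here's a self-contained sketch that uses temporary directories as stand-ins for the server and a collaborator's machine (names like "hub", "alice", and "bob" are just for the demo):

```shell
set -e
base=$(mktemp -d) && cd "$base"
git init -q --bare hub.git              # stand-in for the server's bare repo
git init -q alice && cd alice           # your working repository
git config user.email alice@example.com
git config user.name alice
echo hello > index.html; git add .; git commit -qm "initial commit"
git remote add staging ../hub.git       # label the hub "staging"
git push -q staging HEAD                # publish your commits to the hub
cd "$base"
git clone -q hub.git bob                # a collaborator pulls from the hub
ls bob                                  # prints: index.html
```

Note that alice and bob never talk to each other directly; every repository only synchronizes commits with the bare hub.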

Let's link our repository to the bare repository on the server:

git remote add staging ssh://

# (Uhhhh.... long line folding is making this look wrong. It should all be one line.)

What we are doing here is simply making an association between a local label "staging", and a url that points to the server's repository. In this case, we're using the SSH protocol to make the link. Since we provided a public key, we aren't going to be asked for a password in the next step.

The next step is to set up a branch on the remote that will track our local main line branch, and copy our changes to it:

git push staging +master:refs/heads/master

The "master:refs/heads/master" part is a refspec of the form <src>:<dst>: it pushes the local "master" branch to the ref "refs/heads/master" on the remote, creating that branch if it doesn't exist.
The leading "+" tells Git to update the remote ref even when the push is not a fast-forward.

Once that association has been made between the local and remote repositories, the command to push our changes is much simpler: we just refer to the label we used ("staging"):

git push staging

Go ahead, repeat that command. You should see:

Everything up-to-date

At this point, all we've done is pushed changes to a remote repository. No one can see our files because that remote is "bare" -- there is no working source tree. The most obvious way to deal with this is to go to the server and ask git to pull out a copy of the working files into some web root location.

Switch back to your SSH command window, and checkout the changes on the server:

GIT_DIR=~/repositories/ GIT_WORK_TREE=~/public_html/staging/latest git checkout -f

Here, I'm assuming that you use public_html/staging/latest as a location to push your changes to before they've been finally released.

Now, that's all that is necessary, but we can streamline the process by automating that last step. There are two ways to do this. One is by way of a shell script, which I'll call "deploy":
#!/bin/sh
# deploy: push a branch to a remote, then refresh the server's working copy
# usage: deploy user@server branch
user=${1%%@*}
server=${1##*@}
branch=$2
git push "$server" "$branch"
ssh "${user}@${server}" "GIT_DIR=~/repositories/ GIT_WORK_TREE=~/public_html/staging/latest git checkout -f"

Well, that's a rough guess at a script. It looks like it would work, but I haven't tested it.

Another way is by a Git "hook". Go to the SSH window, and run the following:

cd ~/repositories/
cat > "hooks/post-update" <<EOT
GIT_DIR=~/repositories/ GIT_WORK_TREE=~/public_html/staging/latest git checkout -f
EOT
chmod u+x "hooks/post-update"

Now, whenever you do a git push staging, the post-update hook on the server will do the checkout for you.

Note: if you are deploying applications, you probably already use a .htaccess file or an alternative to hide or block access to resources that ought not to be exposed on your site.
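For example, on an Apache 2.4 host, a fragment like this in .htaccess denies web access to dot-files (adjust to your server; consult your host's documentation):

```apache
# Block direct web access to any file whose name starts with a dot
# (e.g. .gitignore, .env). Apache 2.4 syntax.
<FilesMatch "^\.">
    Require all denied
</FilesMatch>
```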

Sunday, August 21, 2011

A case for web site Minimalism

It has been several months since my old Joomla based site was hacked. At the time the workaround was to put up a static placeholder page, with minimal contact information on it. It seems as if the time has come to revisit the site, now that I'm again on the prowl for more work.

While I endeavor not to lose focus, too many questions arise. "Why?", for instance. My Web site has never been a source of clients. For technical communication I blog here, and for project work I use external services like github. There is just a little bit of a lie to the oft-told tale that a business "needs a Website".

What a business needs is a reputation. A Website is just one location people expect to help them inform their own opinions. Most people don't bother to read, let alone dig around for information, so unless a bit provides some meaningful novelty that contributes to your reputation, that bit should be discarded.

And that's just information on a static site. Bits in database driven CMS sites come with much higher initial costs, a greater ongoing maintenance burden, and unmeasured risk exposures. Is a PHP content management system a worthwhile expense? Maybe. Is it an investment in an asset? It could be, if your brand is worth something. But from a technology perspective most CMS systems present more of a liability than asset.

Thursday, August 18, 2011

Debugging Consoles with Ruby

In Ruby the debugger is not nearly as nifty as the console. Fortunately, there is Pry, a gem that provides an embeddable console. In the simplest terms, Pry provides real-time access to your program's state through a Read-Eval-Print-Loop, much like Rails' console.

I've been a shell programmer for most of my working life, but I've never, ever, ever, been a fan of REPL user interfaces. They suck.  Readline interfaces are fine for shell tools, but there's no particular reason the user interface has to be line-oriented.

One of my first computers, an Atari 130XE, had an interesting editing interface for prototyping code.  It kept lines in a buffer, and supported both immediate execution (of un-numbered lines) and editing (of numbered lines).  Buffering, baby, buffering!  That's the difference between a 30 year old 8k BASIC console and a modern Ruby console: in-place editing.

But wait -- we can get a first approximation of in-place editing by cleverly hooking in an editor component. A quick search turned up a couple of possibilities; I gave the interactive_editor gem a try.

To summarize, in a Rails 3.x app, you would just add the two gems to your Gemfile, run bundle, and then embed the expression

binding.pry

in your code where you want the console to open.

There are a few other details. The vimcast has some useful tips on setting up interactive_editor, for instance, to make sure you get nice colorization turned on. But after that, using it is a breeze. Just use

vi

to open a vim editor into a temporary buffer. When you save and close, the console will run your code. Running vi again will reopen the last temporary file. It's a VEPL, a Vim-Eval-Print-Loop.

But even more interesting for debugging, you can use the editor to work with objects through YAML representations.  Just add ".vi" to the end of an object, and your vim editor will open with the YAML. For instance, in UserController#update, I might do:

@user.vi
You get something like this:

  1 --- !ruby/object:User
  2 attributes:
  3   id: 2
  4   email:
  5   encrypted_password: $2a$10$s/NKiMEC1UfHzgG5JlCCJuHpZbAZ77dz623rq6gt12YUqW7RpvoWW
  6   reset_password_token:
  7   remember_token:

and so on (except, nicely colorized if you followed the vimcast). 

Not only is it much easier to look at and inspect than a puts or the debugger's "p" output, but the YAML is a mutable representation. Editing it, saving and closing will modify the in-memory object. You could do this sort of hot debugging already, but the editor makes it a lot easier to visualize, and lets you tweak multiple properties in one step without having to type any code.