Monday, November 18, 2013

Health Insurance is not Health Care.

Please, stop using the absurdly confused premise that health insurance equates to health care, or even quality of health care.

Conflating government-regulated insurance scams with health care is like confusing federal educational funding with a student getting a real education. Beyond having enough funding to keep a student safe and happy enough to learn, and to attract quality teachers, everything else is just a frill.

Funding is certainly an issue, but the effect of all this knee-jerk reacting by neglectful and passive-aggressive leadership at the top is that influence peddling rules. Medicine is not being practiced by professionals who understand their business models or the practice of their trade. Ask a doctor how much a common procedure will cost, and good luck getting a straight answer. Ask a doctor to show up on time for a scheduled appointment, and good luck spending any meaningful time developing preventative strategies. The profession and the business models of delivery have been undermined so much by reacting to insurance requirements that they no longer have a clue.

Ask a good roofer to show up for an appointment, and they're going to be there early if at all possible. Ask that roofer for an estimate, and they'll inquire in detail about your actual needs and give you an estimate you could take to the bank - or home insurance company. Go to any number of good trades people, and you can find examples. There are plenty of amateurs and pretenders, but real trade professionals have a clue.

Patients are no longer the customers. The customers are now government institutions and insurance companies. They pay the bills, and they dictate the terms under which they will allow transactions to occur. What incentive is there, then, to keep costs down, or for that matter to get patients healthy? The government and the insurance companies get an increasingly larger slice of the pie as the systems grow more convoluted and inefficient, requiring ever more management and external regulation.

The fault for the currently messed-up state of the profession can be put squarely on the formation, over the past 30+ years, of insurance-owned HMO/PPO networks. These attempts to move toward "managed" health care under the thumb of insurance companies screwed up every facet of the profession, from the doctor-patient relationship and billing to the ethics of how, when and why treatment is delivered.

Thursday, September 19, 2013

Does Apple know about usability?

If Google is the king of unusability when it comes to oversimplified interfaces, Apple is right up there with 'em.

I'm trying to get Pages to do simple things. Text wrap. Select a line of text that happens to have an email address in it. Not open an email window just because I am editing a document. Un-apply the email address link that it automatically applied to text that looked like an email address. Within five minutes of starting Pages, I'm running into things that should be simple to fix, or should not even be issues, which Pages either makes excessively difficult or impossible to approach.

Apple, here's a hint, and I'll spell it out so you can understand: make easy things E A S Y to do. Make complicated things predictable, and make obscure things, at the very least, possible.

Users shouldn't be driven to seek help on a community forum just to work around how your clever auto-markup prevents the mouse cursor selection from working.  Hell, even VIM, the demigod of arcane editors, has Pages beat by a long mile when it comes to usability.

Jobs is gone, and he left an important legacy behind. But the rest of the company needs to move on, and innovate. At least then, even if the tech still isn't usable, I'll have something worth learning.

Wednesday, August 14, 2013

CSS: Reality Lies To Us

Well, CSS media queries don't really suck, they just aren't expressive enough to allow a designer to concisely specify the intent of the design, or even to specify intent at all. So, OK, they suck a little, but not totally.

SASSy CSS and LESS CSS compilers can help, but they are mainly server-side technologies. Yes, your browser can be tricked out to run a compiler against them, but until the syntax is embedded in the client that's more of a hack than a deployment technology.

Furthermore, SASS and LESS don't alter the execution context of the decision making in applying CSS stylesheets. The languages support computation and refactoring of sources, but only using a-priori knowledge: discrete facts available outside of any particular deployment context. This is an intentional feature of CSS, not a bug, but the manner in which CSS media queries are written degrades the expressiveness with respect to the designer's intent.

For example, why do we write media queries that ask:

@media screen and (orientation: landscape) and (min-device-width: 1024px) and (max-device-width: 2048px) { ... }

... when what we mean to ask is whether or not this is a touchpad similar to an iPad in size?
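An SCSS preprocessor can at least give that intent a name at authoring time, though the browser still receives the raw query unchanged. A sketch - the mixin name and breakpoints here are illustrative, not a standard:

```scss
// Name the intent once; reuse it wherever the design needs it.
// This is compile-time only -- the browser never sees the name.
@mixin tablet-sized-landscape {
  @media screen and (orientation: landscape)
      and (min-device-width: 1024px) and (max-device-width: 2048px) {
    @content;
  }
}

.gallery {
  @include tablet-sized-landscape {
    column-count: 3;
  }
}
```

The name survives in the source for the next maintainer, but the deployed CSS still asserts hardware pixels rather than meaning.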

One thing we could do is use a Web standard. How about the HTML "handheld" media type?

Virtually no mobile device will trigger the "handheld" rule. The first phones to implement the handheld media type were crude affairs, and rather than do the right thing when they rolled out new devices, vendors of smart phones decided to put up their collective middle finger to Web standards and have their browsers lie.  They all claim to be monitor screens, which, by the way, they are not.

It turns out that "handheld" does not really tell us it is a touch pad device anyway. Handheld tells us only that it is a mobile device. We also have touch screens that are static devices, like Point of Sale terminals, kiosks, and appliance-style surfaces. Even if vendors complied with Web standards and supported "handheld," our pages still wouldn't know if the device supported the touch screen modality and/or provided WIMP cursor-based interactions.

CSS, as implemented by vendors, has made us compound error upon error. It isn't just imprecision; it is an emphasis on irrelevant precision and a focus on the wrong parameters. Designing using quantized units, such as ems, goes part way in mitigating some of the over-constrained precision. Yet CSS still does not allow us to write what we mean, and still forces us to make assertions that are not correct in order to trip the machine into behavior that looks, more or less accidentally, correct as a side-effect.
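For instance, an em-based breakpoint tracks the user's effective font size rather than hardware pixels - still not intent, but at least quantized on a more relevant parameter. The 48em figure below is purely illustrative:

```css
/* Breakpoint in ems: scales with the user's base font size
   instead of asserting a hardware-specific pixel width. */
@media screen and (min-width: 48em) {
  .content { width: 46em; margin: 0 auto; }
}
```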

Here's another example: CSS goes out of its way with its backwards min-/max- prefix syntax so as to avoid less-than and greater-than operators. This is outstandingly silly as we go forward with HTML5. This is technology: instead of avoiding math we should embrace it. The algebra isn't really that scary:

@media touchscreen and (52ppi <= resolution <= 104ppi) and ( 7.31in <= device-size <= 9.5in) { ... }

Thursday, August 1, 2013

Software as an Abstraction vs Software as a Practice

The past few months have seen quite a bit of turnover in some groups at my workplace, one an administrative team and the other an I.T./Web-support team.

It got me to thinking this morning. While some churn is unavoidable and at a low level is probably good for an organization, it can also be corrosive when a proportionally large chunk of the team just goes away. It eats away at the knowledge and experience embodied by the team as a whole.

We like to think of operational software as some sort of abstract machine. Yet software is also what a team does to reflect upon, institute, and refine its own knowledge and behavior: it is a professional practice.

So when a bunch of people up and leave an organization, the practice can break down.

Saturday, July 13, 2013

Travis CI: a chink in the armor

My employer is considering Travis CI and I have input to our process, so a couple of weeks ago I decided to experiment using an open source project.

I ran into a roadblock when I initially forked a project and tried to add it to my Travis CI builds: the variables in the config file were all wrong. Even worse, the variables were all encrypted, so there was no way to learn by example from the original settings, let alone compare and contrast them with a user guide or reference.

Yet this is just an opportunity to check out the Travis CI community. So I posted to a Google Group, travis-ci:
I am just starting with Travis-CI and a little confused about the use of encrypted variables in a .travis.yml file. 

Suppose a repo is forked on Github, which contains a few dozen - secure: entries, 
and that there is no record of the names of the variables that have been encrypted.

Is it possible to determine the variables' names (or really, the role they play in the build), 
for the purpose of updating to whatever makes sense relative to the forked repo?

Sorry for the newbie question, but it seems like it would be pretty hard to carry forward a sane config unless the original variables were documented in some way. 
But, to quote The Walrus and The Carpenter, answer came there none. 

Travis CI should take note: a community that doesn't know how to respond to a recommended practice of throwing away source opens a window of opportunity for a responsive competitor to offer a better alternative, even if only by way of a best practice.

The pragmatic answer could be "we don't know" or "we are ignoring you because we don't know you," or possibly some combination thereof, but I'm inclined to believe the pragmatic answer is "no".   

Bottom line: It is a rather insane practice to rely on fully encrypted code as if it were source.  The config file isn't source - it is a build artifact of a source template, parameterized by instance variables. It is the latter that should be encrypted; the former is needed in order to reproduce the config file artifact within a different context (like a fork). 
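To sketch the alternative argued for here - the file layout, variable names, and comments below are illustrative, not a Travis CI convention - a fork-friendly config documents what each encrypted entry is, and encrypts only the values:

```yaml
# .travis.yml -- each secure entry generated by, e.g.:
#   travis encrypt DEPLOY_TOKEN=<value> --add
# A comment records each variable's name and role, so a fork
# knows exactly what it must regenerate with its own credentials.
env:
  global:
    # DEPLOY_TOKEN: api token used by the deploy stage (dummy value)
    - secure: "Zm9yay1mcmllbmRseS4uLg=="
    # COVERALLS_REPO_TOKEN: coverage reporting (dummy value)
    - secure: "c2Vjb25kLXNlY3JldC4uLg=="
```

The encrypted blobs remain opaque, but the template they instantiate is preserved as source.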

Saturday, June 29, 2013

Building Angular-UI Utils

I tried again, doing a clean install of homebrew, node, and npm.

git clone

cd ui-utils

bower install

grunt build      # Fails with errors about stuff not being installed locally.

npm install grunt-contrib     # Fails because of crap in npm's cache...

That the local npm install failed was strange because I had just cleaned it before reinstalling yeoman, bower, and grunt. But whatever -- cache clean is your friend.

npm cache clean

npm install grunt-contrib-watch
npm install grunt-contrib-uglify
npm install grunt-contrib-concat
npm install grunt-contrib-clean
npm install grunt-contrib-jshint
npm install grunt-contrib-copy
npm install grunt-karma

grunt build

Running "concat:tmp" (concat) task
File "tmp/dep.js" created.

Running "concat:modules" (concat) task
File "components/angular-ui-docs/build/ui-utils.js" created.
File "components/angular-ui-docs/build/ui-utils-ieshiv.js" created.

Running "clean:rm_tmp" (clean) task
Cleaning "tmp"...OK

Running "uglify:build" (uglify) task
File "components/angular-ui-docs/build/ui-utils.min.js" created.
File "components/angular-ui-docs/build/ui-utils-ieshiv.min.js" created.

Done, without errors.

Finally, no errors, but... where are the build files?



Seriously? Maybe someone punted.
At least the grunt build worked.

Aside from making a clean install of node, npm and several other node modules globally like yeoman, bower, grunt, coffeescript and coffeelint, this is what it took to build the ui-utils module.

Tuesday, June 25, 2013

Picking up AngularJS (or, NOT picking up Angular-UI)

I've been trying to get up to speed and moving fast with AngularJS recently, with only limited success.

The main problem at this moment in time: Angular is unstable. It caused my browsers to crash or lock up erratically about one out of ten reloads. We're not talking about logical flaws causing infinite loops, mind you. The browser's memory gets so demented that it has to be force-quit and restarted.

Well, that's on Chrome. Safari is worse, if for no other reason than that the developer tools suck more. Firefox was more stable but still suffered the occasional seizure.

The concept is great though. The management of state between controllers, scoped models, and directives with your own Javascript objects and methods presents advantages of clarity and modular cohesion over 100% imperative coding and page-flipping apps with one-off ajax user interface features. It is enough to make me keep looking.

OTOH, the broader the scope of Angular tech I look into, the more I get bogged down in pieces that just don't work. Angular-UI is one such adjunct project. The ui-utils-validation module is either completely broken, incorrectly documented, or both. The demos don't do anything.

After poking through the repo for a while, and incredulous that tests are actually passing, I fork Angular UI. Apparently the project is in the midst of some reorganization, because the build info in the repo is incomplete. Grunt complains there's no gruntfile.js; perhaps it is grunt.js instead? A rename and... Grunt complains of tasks not being found. Anyway, it isn't working.

I figure maybe I'll just run the tests. Karma won't start because Firefox and Chrome aren't in the executable path. I've seen that before and somewhere I've got the workaround to fix it,  but I'm not going to bother. I've just realized I'm shaving yaks, and it is time to put the clippers away.


Well, I went back to poking around with the validate module. The nearest I can determine is that the current Angular-UI validate isn't working at all with the latest Angular. This I already knew, but one note to make is that debugging third-party Angular code when it goes wrong seems like it would be a seriously painful exercise. This is really still just Javascript after all, and despite Angular's best intentions there seems to be quite a lot of "action at a distance" going on. At least enough for a simple module to clobber a whole page.

[Update Again]

So I tried a few things. Chrome was croaking like a toad in heat every time I opened the ui-utils demo page: Aw, Snap! I shut off ALL of my Chrome extensions, every last one, and restarted Chrome. That, at least, let it sort of load properly. The demos didn't exactly work right, but it seemed more a logic error than the songs of mating frogs.

Finally able to read the code behind the screwy demos, it appeared that the "utils.js" file I had been including was completely and utterly insufficient: the demo includes a file "ui-utils.js", which appears to be the concatenation of all of the modules' js files, with utils.js up front. So what to do? Build!

The first real hint that something was wrong was when Bower complained I was using an outdated file, "components.json," instead of the "correct" name, "bower.json." The Yeoman generator was apparently written while the Bower team was playing musical chairs with the file naming conventions. So I'm not sure at all whether something else is outdated or I have the wrong versions of Yeoman, Bower, Grunt, or whatever fubar component... the software is a moving target, and package management, while present, lacks closure and consistency over the required artifacts.

Pressing on -- who knows what will work at this point -- I run "grunt build". Ermagherd, it tells me, grnt.jrs isrnt thr! More musical chairs with croaking toads, so I ln -s Gruntfile.js grunt.js  and rerun the build. Now it tells me "Warning: You need to have Ruby and Compass installed and in your system PATH for this task to work. More info: Use --force to continue." Urk. I have Ruby, let me RVM that: rvm use ruby-1.9.3-p0@sandbox and gem update compass. Building works, but doesn't do anything interesting.  It is looking like the Yeoman scaffolding has to be mucked around with to integrate in another grunt-file built component.... stay tuned.

[Final Update]

I thought, well, I'll go back to the sources. So I cloned angular-ui, and ui-utils, and found that the build process experience was a train wreck out of the box. Lots of tools aren't specified, and the ones that are obvious are misconfigured from apparent version drifting and file name conventions shifting.  Attempting to update npm/yeoman/bower/grunt resulted in tripping over holes in their respective configurations, particularly npm and grunt. So I'm giving up on Angular-UI - it isn't even ready for alpha, let alone beta, let alone prime time. Maybe in a year or two...

Incidentally, the pothole with grunt was that they stopped distributing the command line tool with the main package "grunt," so one must now also install the "grunt-cli" package.
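Concretely, the split looks like this in grunt 0.4-era packaging: the -g install puts the launcher on the PATH, while the task runner itself becomes a per-project dev dependency.

```shell
# The `grunt` command moved to its own package, installed globally:
npm install -g grunt-cli
# The task runner is now installed per project:
npm install grunt --save-dev
```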

Monday, May 20, 2013

Cultured Software, Part IV

Continuing my thoughts upon the subject of fixed points: geometrically, a fixed point is any point that remains fixed when a figure is transformed. For instance, under the operation of rotation the center point of a circle is a fixed point.

Algebraically we look for elements that remain unchanged under some mapping that is consistent with the system's assumptions. Group theory calls these mappings homomorphisms, and when they are 1:1 and cover all the elements under consideration, they are isomorphisms or permutations.

Take, for instance, the set { ø, A, B, α, β } under addition defined by:

ø + (any element) = the same element
A + A = B
A + B = α
B + B = β  
A + α = β
B + β = A
α + α = A
α + β = B
β + β = α
A + β = ø 
B + α = ø 

The first rule gives the identity element. The last two give inverses. This set of elements is isomorphic to Z5, the set of integers mod 5. We can show this with a bijective function that preserves the addition structure (an isomorphism):

{ (ø, 5), (A, 1), (B, 2), (α, 3), (β, 4) }
We can see the symmetry about 0 and the inverses more directly by restating the domain set and the isomorphism as: 
{ (ø, 0), (A, 1), (B, 2), (α, -2), (β, -1) }
A group like Z5 has a unique identity element. An identity element remains fixed under all group isomorphisms: whatever the identity is in the input, it must map to the identity element in the output. Five plays the identity role under addition mod 5, since 5 = 0 mod 5.

Addition also remains well-defined: inverses and the 'order' of each element (in effect, the way skip-counting works) must be preserved:
{ (ø, 0), (A, -2), (B, 1), (α, -1), (β, 2) }

The structure of the group is maintained across the transformation:
  • ø goes to 0, identity to identity. 0 is its own inverse, and 0 has order 1 (one element results from skip counting 0+0). 
  • A goes to -2 and its inverse β goes to 2, the inverse of -2. Similarly for B and α.
  • By skip counting 2: 2+2=4, +2=6=1 mod 5, +2=3, +2=5=0 mod 5, we get that the order of 2 (and -2) is 5. Similarly for B and α.
Had this been a slightly different type of group, say, Z6, there would be elements 2 and 3 which, by being factors of 6, are factors of 0 in that group; such elements give rise to sub-structures. The element 2 skip counts to { 0, 2, 4 } and 3 to { 0, 3 }, both of which are a fraction of the complete set Z6; neither subset includes 1 or 5, which fall outside the orbits of 2 and 3.
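The claimed isomorphism is small enough to check mechanically. A quick sketch in Python, with the Greek letters renamed to ASCII (α as 'a', β as 'b', ø as '0'):

```python
# Verify that mapping the toy group onto Z5 preserves addition mod 5.
add = {  # the addition table from the post, symmetric in its arguments
    ('0', '0'): '0', ('0', 'A'): 'A', ('0', 'B'): 'B',
    ('0', 'a'): 'a', ('0', 'b'): 'b',
    ('A', 'A'): 'B', ('A', 'B'): 'a', ('B', 'B'): 'b',
    ('A', 'a'): 'b', ('B', 'b'): 'A',
    ('a', 'a'): 'A', ('a', 'b'): 'B', ('b', 'b'): 'a',
    ('A', 'b'): '0', ('B', 'a'): '0',
}

def table(x, y):
    # The table lists each unordered pair once; try both orders.
    return add.get((x, y)) or add[(y, x)]

iso = {'0': 0, 'A': 1, 'B': 2, 'a': 3, 'b': 4}

# Homomorphism property: f(x + y) == f(x) + f(y) mod 5, for every pair.
ok = all(iso[table(x, y)] == (iso[x] + iso[y]) % 5
         for x in iso for y in iso)
print(ok)  # True
```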

And here is the point: when we grow software, like it or not, we are implicitly defining a kind of broken algebra. Usually it is a messy affair, both incomplete and inconsistent, replete with exceptions. Yet despite the incongruities, identifiable structure exists.

Fixed points in software give rise to symmetries analogous to those found in algebras - similarity and congruence in subordinate structures, inverses and zero divisors. If we fix two diagonally opposing corners of a square figure, we define an axis of symmetry. By fixing points we constrain the set of morphisms on the space defined by the square, leaving just one valid movement: a flip about the diagonal.

In a software code base, fixed points act to constrain, inhibit or completely proscribe adaptations. Like local fault inclusions formed in a crystal lattice by too-fast growth or foreign particles, fixed points introduce cleavage planes into the structure. If the system is well-formed the structure may be minimally complex. 

But fixed points in software are often incidental and accidental, leading to anomalous structuring. In a software system, every random fixed point injected renders it more prone to weird structuring. Whether we get a well-structured system with clean cleavage planes, a quizzical Rube Goldberg contraption, or an amorphous mass largely depends upon the environment in which the software is grown and the rate at which decisions crystallize.

Wednesday, May 8, 2013

History doesn't repeat itself...

A contemplation upon a disturbing waking dream, recalling the passing of two twins who were friends.

Quoth George Santayana,
Those who do not remember the past are condemned to repeat it.
But George also said,
History is a pack of lies about events that never happened told by people who weren't there.
Both contain a subtle truth, for history never truly repeats itself; it just exhibits the emergence of self-similar event patterns across time and space.

History is fractal.

Monday, April 22, 2013

Change the default SSH port on OSX

I got some pushback from sys admins about keeping an SSH server open on the default privileged port 22. They said "change it," so here's what I did.

First, I edited /etc/services, and changed the ssh entries to use a new port number. Choose a port above 1024 that isn't in use already.
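The edit can be rehearsed against a sample first, so the real file stays untouched until you're satisfied. Port 2022 below is purely illustrative:

```shell
# Dry-run the /etc/services edit against a sample copy.
cat > /tmp/services.sample <<'EOF'
ssh              22/udp     # SSH Remote Login Protocol
ssh              22/tcp     # SSH Remote Login Protocol
EOF

# Rewrite the port; inspect the result before touching /etc/services.
sed 's|22/|2022/|' /tmp/services.sample
```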

Second - and this is useful for people using git and ssh outbound - edit /etc/ssh_config and under Host *, add an entry:
  Port 22
With /etc/services remapped, this keeps outbound ssh clients connecting to other hosts on the standard port 22.

Another method uses /System/Library/LaunchDaemons/ssh.plist, but the above is a more Unix-centric way. On my system, the ssh.plist has a "disabled" key anyway.

Wednesday, April 17, 2013

Fixed Points in a simple script

I had a simple task of taking a database extraction from some Oracle financial database view into MySQL database tables.

Simple, right?

We're using Perl, which has flexible built-in data types to handle this sort of thing. It ought to be a clean mapping:

my $orahandle = $oracledb->prepare("SELECT * FROM $table") or die "Query failed on $table";
$orahandle->execute() or die "Query execute failed on $table: @{[$orahandle->errstr()]}\n";
while( my $data = $orahandle->fetchrow_hashref() ){
  if ( ! $inserthandle ) {
    @columns = map(lc, keys(%$data));
    $paramstr = ('?,' x ((scalar @columns)-1)).'?';
    $inserthandle = $targetdb->prepare("INSERT INTO $mysqltable (".join(',',@columns).") VALUES ( $paramstr )") or die "Cannot prepare insert: @{[$targetdb->errstr()]}\n";
  }
  $inserthandle->execute( values(%$data) ) or die "Error: failed to insert. @{[$inserthandle->errstr()]}\n";
}

The two tables are isomorphic so the problem scenario is one of a map. Theoretically, we do not need to know the names of the fields, or their data types, because the types are isomorphic too, more or less. The point is that discrete conversion is not demanded by the semantics of the problem scenario.

Having a rule, an isomorphic mapping if you will, is important even when "more or less" means there are exceptions. Treating code as a morphism facilitates articulation of the exceptions differentially, with respect to how they change the mapping, rather than obscuring all in undifferentiated procedural code. 

The problem in this specific case is that there are several DATE data type fields in the table, and the default format for Oracle is 'MM/DD/YYYY'.  MySQL has a fixed constraint upon the format of inserted dates, the ISO 'YYYY-MM-DD' format. It is a fixed point.

Programmers are taught to just hack it. List the field names individually in the SELECT and INSERT (more fixed points). Manipulate each DATE field in a discrete variable and substitute the transformed value through in-line code (even moar fixed points). It works, at the cost of tossing out the inherent symmetry in the scenario. And it adds several gratuitous fixed points - at least a couple for each discretely mangled field.

Now, one might think to coerce Oracle into putting the format back with:

alter session set NLS_DATE_FORMAT = 'YYYY-MM-DD'

But the upstream supplier of the view decided to make all the dates VARCHAR2s, so none of the normal date formatting is applied. Yet another fixed point. MySQL's fixed point was, at least, a standard; this one is completely gratuitous, and it is the real reason the approach breaks: had the columns been real DATEs, a single session-level NLS_DATE_FORMAT would have restored ISO output. My solution is to define functional data-scrubbing callbacks, to inject data manipulation into the otherwise symmetric process.
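A sketch of what those callbacks could look like - the column name and format here are hypothetical, not the actual schema:

```perl
# Hypothetical scrubbers, keyed by lowercased column name. Columns
# without an entry pass through untouched, preserving the symmetry.
my %scrub = (
    trans_date => sub {
        my ($v) = @_;    # 'MM/DD/YYYY' -> 'YYYY-MM-DD'
        return $v =~ m{^(\d{2})/(\d{2})/(\d{4})$} ? "$3-$1-$2" : $v;
    },
);

# Inside the fetch loop, applied uniformly to every row:
for my $col ( keys %$data ) {
    my $fix = $scrub{ lc $col };
    $data->{$col} = $fix->( $data->{$col} ) if $fix;
}
```

The exceptions are now articulated differentially, as changes to the mapping, instead of being scattered through in-line procedural code.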

A footnote: not all fixed points are undesirable. A couple of the views thrown obliquely over the wall at me don't even have keys. No primary keys. No keys at all. Keys define the symmetries and structure of a relational data set. Without them a view presents no relations, being merely an undifferentiated amorphous mass of records. Such a practice is to IT design as mud pies are to fine cuisine.

[edit: Cleaned up code examples. Don't know why Blogger's editor keeps doing this, but it keeps chopping up my markup. ]

Sunday, March 3, 2013

VIM Plugins

I had been using MacVIM with Austin Taylor's VIM configuration, tilde, and recently replaced it with Yehuda Katz and Carl Lerche's Janus VIM configuration on my MacBook. Both add plenty of bells-and-whistles plugins: keyboard mappings, commands, syntax highlighting, etc. The two motivating factors are that (a) Janus is documented whereas tilde is not and (b) tilde does a few things that mismatch my preferred defaults, such as setting very magic regex mode and altering the meaning of keys that normally do navigation.

The big brick wall that I hit with any of these configurations is that unless you are a vim plugin writer,  the morass of code is an incomprehensibly ugly mess. Note: I can read the code, but VIM script is usually so ugly that the skin on my eyeballs starts peeling from gazing upon it.

The size of these VIM config bundles exacerbates the obfuscation. There are simple means of showing the key mappings in vim. (Incidentally, this wikia is a good read for learning about key mappings in vim.) Type :map (or :imap/:nmap/:vmap for insert/normal/visual mode mappings):


...and you'll get a list of the key mappings. Now, with proper grouping a few dozen mappings can be readily accessible to the working memory. It is not unusual for vim configs to have upward of hundreds of key mappings. When you type :map all those mappings get dumped to a pager in the window, without any meaningful grouping. Not so nice for reading. This bloat is perhaps the best reason of all not to simply reuse someone else's VIM config bundle.

Try verbose:

:verbose map

This shows the file responsible for last setting a given mapping. This is nice for debugging why your expected mappings fail miserably and for identifying which mappings belong to a particular plugin (frequently undocumented).
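When the pager dump is unwieldy, the output can also be captured to a file for searching and grouping offline - standard vim redirection (the file name is arbitrary):

```vim
" Capture the mapping list, with source files, to a text file.
:redir! > vim_maps.txt
:silent verbose map
:redir END
```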

Another problem with VIM mappings is that they are very often convoluted but only very rarely commented. Carlhuda's VIM bundle is an exception. Is a user really expected to read a mapping like this in a listing meant as a quick reference?

ai * :<C-u>cal <Sid>HandleTextObjectMapping(0, 0, 0, [line("."), line("."), col("."), col(".")])<CR>
What is the intention of the mapping? Idunno. Something about text indentation? Idunno. Should I even care? Idunno. The source file for this mapping had a few sparse comments. None of them were especially meaningful, but then again comments in vim mappings don't help the :map output. Ideally, the map command would allow a descriptive label to be attached, but that's not the way it works.

Sunday, February 24, 2013

Pet Pensiveness

We had to euthanize a family pet yesterday, a 55 pound labrador mix we named Molly. Molly had a great many qualities one looks for in a dog: she was submissive, rarely barked except when asked to or when she thought she was defending us. She was as attached to us as we were to her.

When we first found her, she was said to be the last of her litter - a runt no one else chose. We were told her other parent may have been a shepherd mix, but whatever her heritage she bore an uncanny resemblance to an oversized Finnish Spitz.

Molly had a sort of melancholy disposition from the start, as if she had felt abandoned. As she grew up we found her to be calm, obedient, and sociable on one hand, but when she switched it on her playfulness was almost wolf-like.

My waking moments were filled with memories, and one simple thought. So many people try to focus on vanishingly distant, imaginary end-points, seeking rigid modes of thinking and maintaining narrow perspectives, racing so hard and fast yet giving so little thought to where they are going - under these conditions the lateral realities that pervade and indeed ground our lives become a blur that is easier to ignore.

Monday, February 4, 2013

GIT-ified views

When I first discovered Distributed Version Control Systems (DVCS), one of the few open source implementations available was Tom Lord's Arch, or tla. Arch was a direct predecessor to Bazaar, a spiritual parent to DARCS, and preceded GIT by about 4 years. With the ground plowed up by these projects, Linus re-envisioned the whole-changeset implementation and added a core audience in the form of the Linux Kernel project. It also didn't hurt that GIT's speed and space efficiency were better.

Coming out of an environment that used Rational's ClearCase, I found it useful to emulate the "set view" functionality using carefully constructed shell functions that could set up or reuse working trees on demand, separate from the repositories. A child login shell was launched to establish a repo-specific environment, and the environment curried to include tla command wrappers with defaulted common command arguments (tla was excessively verbose). I called the resulting wrapper Setview, after the ClearCase command.

The ClearCase feature provided a measure of process- and filesystem- isolation to the development cycle. This was probably deemed to be important to lawyers, who wanted to micromanage what every person could see. Never mind that they probably relied upon tons of GNU copylefted code, but I digress. There are still times when, instead of a stash or a branch/commit/checkout/branch, or cloning a repo a second time, I'd like to just have a transient, isolated instance of the working tree.

And I have to ask myself, "why?"

A decoupled working tree can more easily be automatically garbage collected. When I create temporary and intermediate files in such an ephemeral tree, I don't have to be concerned with clutter: it will go away on its own when I discard the view.

If the repo is decoupled from the working tree, that is, not in a .git subdirectory, the working tree can be destroyed accidentally without risk of destroying the repo. This is mitigated by a frequent cycle of pushing to a remote.

Switching working contexts is something every software developer has to do, especially when working in an institution with multiple internal clients. A pre-configured shell environment, launched when starting a task, and cleaned up when the task is finished, is a useful device for establishing context. It gives a sense of space, boundaries and structure, which is an illusion but helps keep work and thought processes organized.

Having common commands and options curried into wrapper functions is helpful. I find git to be less verbose than tla, but the inconsistency and whacked out grammar in the git command line interface is still a challenger to tla for the most obfuscated command line contest. Still, having aliases, functions, and some environmental variables set per view allows tools to be configured based on the task at hand rather than globally or per repo. An extremely simple example, but very pleasant, is to be able to go snooping around other directories and then just lcd (local cd) to somewhere in the $VIEWHOME. I know pushd/popd offers similar functionality, but it isn't quite the same and lcd is shorter.

For whatever its benefits, having wrappers for git seems a necessity, be they scripts, aliases, functions, or whatever. Having wrappers coupled to the working context, and not coupled to the repo or the users' login environment, is useful for ensuring good isolation and control over configurations.

I don't know if those are good enough reasons for re-implementing a more robust version of setview for git, but sooner or later it seems like I must.
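If I ever do, the skeleton might look something like this bash sketch. It is entirely hypothetical (the real ClearCase setview did far more): a subshell whose environment, prompt, and rc file are tied to the view, so everything vanishes when the shell exits.

```shell
# setview: enter a named view in a subshell (hypothetical sketch).
# Views are assumed to live under $HOME/views/<name>, each with a .viewrc.
setview() {
    _view=$1
    _home="$HOME/views/$_view"
    [ -d "$_home" ] || { echo "setview: no such view: $_view" >&2; return 1; }
    # Layer the per-view rc on top of the login environment (bash-specific).
    VIEWHOME=$_home PS1="[$_view] \$ " bash --rcfile "$_home/.viewrc" -i
}
```

Exiting the subshell is the cleanup step: no aliases, functions, or variables leak back into the login environment.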

Sunday, February 3, 2013

A note about meaningful gestures

A family member and her husband stopped by to visit for a few hours while travelling for a vacation. They treated us to lunch, during which I asked how her kids were doing. She said "Thanks for asking," with some delight. My wife observed later in the day that this was very much appreciated.

That caught my attention, and made me reflective. I mean, more reflective than usual. For a variety of reasons I have lost touch with a lot of people over my life. But it is more poignant when they were close relationships.

Not long ago I told a Meetup group co-organizer that I was considering leaving the group. It was nothing personal - it just seemed like time to move on, and I felt spent. She remarked that in her opinion it was a bad idea, and that I might not understand the value of the network I had there. My reaction was that this may be true - I may not fully appreciate what I do have in that network - but at the same time it so often feels (for reasons that presently escape me) that I cannot leverage that network in any meaningful way. I am "there" but not moving with the flow, like a phantom. I can observe, but I'm hardly able to respond in real time or interact in a way that produces substantive growth, personally or in the relationships.

I have felt the same way about many of my family relations for most of my life. It is pretty painful, actually.

Saturday, February 2, 2013

Cultured Software, Part III

Over the last two posts I obliquely drove from venting about PHP to discussing software as virtual fecal matter, went on to compare refactoring to composting, and finally introduced what I think of as Fixed Points.

This post is to expand upon the idea of a fixed point, which is a little more formal than a metaphor and a lot less rigorous than its algebraic namesake.

It is often helpful when generalizing to enumerate cases, examine concrete examples, and think inductively. Make a list: what might be some fixed points? I'm working with database and Web applications right now, and these surface details spring to mind:
  • Database connection details entrained in code: user names, passwords, literal database names, and such. 
  • Links between files, as by an inclusion construct, addressed via filesystem paths.
  • Queries that are substantially or exactly repeated (unDRY).
  • Explicit non-reflection. References to properties, columns, fields... any explicit (re-)enumeration of the element parts of an abstract whole. The challenge here is that when there is a high degree of correspondence between structures, referring to discrete elements may be simpler, clearer, and more efficient to express than referring to the whole.
  • Gratuitous Typing. Types assigned to a resource are fixed points even if they are not repeated. For instance, if we assume all image resources are in JPEG format, we constrain the set of MIME types, conventional file names, suitable rendering mediums, and salient libraries. Or when we assign a precise length to a data field when no particular length is demanded by the problem. I refer to such assignment of types as "gratuitous," not because there isn't some utility -- these are sensible techniques for ensuring performance criteria and reducing the decision space -- but because languages which encourage static typing are often designed to force the choice of assigning types (even if they are weak types) and have few or no facilities for lazy evaluation or adaptive interfaces.
  • Positionally addressed fielded data (falls into the category of gratuitous typing). 
  • Code modules included from an external authoritative source may be a fixed point, assuming that there will always be some reason to incorporate updates made to the external module. 
  • Cut-and-pasted 
    • content
    • algorithms
    • code
Assumptions and inexpressive syntaxes give rise to fixed points. I'm distinguishing here between meaningful symmetry points and gratuitous fixed points that break symmetry. Facts that need not be explicitly represented in code, but are there anyway, are fixed points.
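A tiny illustration of the first item on the list above, with invented names throughout: the connection details entrained in the first function are fixed points, while parameterizing them leaves only the shape of the intent in code.

```shell
# Entrained connection details: three facts the code has no need to own.
conn_fixed() { printf 'host=db01.internal user=appuser db=appdb_prod'; }

# Parameterized: only the *structure* of a connection string remains in
# code; the facts move to the environment, where they vary per deployment.
conn() { printf 'host=%s user=%s db=%s' "${DB_HOST:?}" "${DB_USER:?}" "${DB_NAME:?}"; }
```

The first version fractures the moment the database moves; the second merely asks a different environment the same question.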

It could be freakishly hard to understand code if it were all written in Ruby metacode style, or used overly generic labels for every data structure and function. Yet overly literal styles of coding necessarily over-constrain problems while under-expressing the solution intent. Overreliance upon metaprogramming can be hyperopic, but being too concrete is myopic: the stakeholder is presented with evolutionary dead ends, the developer with death by a thousand asymmetric cuts.

Fixed points impose constraints upon the contexts in which the intents expressed in the code are valid.

Fixed points may also be false invariants. What matters is not sensitivity to change as such, although fixed points tend to be facts and conditions that change slowly or at poorly understood intervals. What matters is that the assumptions underlying fixed points can change, and when they do, the fixed points add rigidity and induce fractures in the structure of a codebase.