Monday, September 28, 2015

Emotion is the basis for knowing, not intellect

Society tries to teach us that intellectual pursuits will necessarily make us smarter - that "building your brain" around facts and problem-solving tactics is the way to improve our understanding.

That's so wrong, it's a special kind of stupid.

Emotional capacity, not intellectual content, is the carrier wave of intelligence; emotional response is the method by which that carrier is modulated to encode intellectual content. Without the emotional carrier and modulation, intellectual content is indistinguishable from background noise or static.

That's why emotional events can be so disruptive to clear thinking, and why activities, both intellectual and athletic, that allow one to reach a flow state simultaneously make one feel "in the groove".

Thursday, September 17, 2015

Comprehensive Sense Making

Trying to comprehend a process consisting of multiple stakeholders in multiple workflows is akin to a kid watching the guard-rails pass the car window on a long road trip.

When your reference frame is traveling at a different speed and direction than the frame of the thing you're observing, you may get 20-20 vision, but the whole thing is still going to be a blur.

Such is the problem of trusting the viewpoints of individual users reacting to day to day business activities: their view is blurred by the pace of life. And so is yours.

Want to align visions better? Reduce the number of players in the game at any one moment. Allow them to ramp up their speed by communicating exclusively with one another. When business needs push them apart, allow them to disengage from the process instead of thrashing against it.

Recognize that until the game is fully underway, what the players see of each other will be blurred, and any tooling created to support the process will necessarily incorporate the blurred understanding.

Sunday, July 19, 2015

Set -o vi FTW!!!

A programming buddy recently reposted some tips on using the OS X Terminal, from TechRepublic.
My terse response was

set -o vi

Which, of course, calls for some explanation. This option configures a POSIX shell like Bash to use the key bindings of the ancient and ever-popular "vi" editor as its command-line interface.
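
To make vi mode stick across sessions, one approach (assuming Bash) is to append the option to your shell startup file; other Readline-based programs can pick up the same behavior from ~/.inputrc:

echo 'set -o vi' >> ~/.bashrc
echo 'set editing-mode vi' >> ~/.inputrc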

Transliterating most of the TechRepublic article, here are the roughly equivalent shortcuts:

Escape
Puts the terminal command line into "Command Mode". Single-letter commands from vi will now work, including searches.

I or i or a or A
Puts the terminal command line into "Insert Mode".
Uppercase I puts the cursor at the beginning of the line, lowercase i leaves the cursor where it is.
Uppercase A puts the cursor at the end of the line; lowercase a puts the cursor just after the current character.
Hit Escape to leave insert mode.

w and b
Command Mode: These shortcuts allow you to move the cursor between separate words in a command line.
Use 'b' to move back and use 'w' to move forward down the line.

e
Command Mode: This shortcut moves the cursor to the end of the current word on the command line.

d b
Using this key sequence will delete the word immediately before, or to the left of, the cursor.

d b b p
This will swap the two words that appear immediately before the cursor.
So, if "this is" sits before the word the cursor is on, using d b b p will change that to "is this."

/somesuch RETURN
If you need to locate a previously used command, type /somesuch and hit the RETURN key.
This searches the command history and lets you find a previously used command
that you may need to access again.
Hit 'n' to find the next match.

?somesuch RETURN
Same as /, except it searches the history in the opposite direction.

$
Using '$' will take you to the end, or the far right, of the line where your cursor is.

0
This shortcut is the opposite of '$'. Typing '0' will take you back to the beginning, or the far left,
of the line you are currently working on.

Control + C
If you need to kill what's currently running, use Control and C in Terminal to abort the current application.

d 0
This key sequence clears the entirety of the line before the cursor.
So, if you get to the end of a line and realize the whole thing is wrong, d 0 will delete it all.

0 D
This pair will clear the entire line: 0 jumps to the beginning, then D deletes to the end.

D
Using D alone will have a similar effect, but only clears the part of the line after the cursor.
It is helpful if you need to change or delete the latter half of a line.

Command + Q
This will quit Terminal entirely by killing the underlying process.
Let's face it, you weren't getting anything useful done anyway.

Control + L
This will clear the entire Terminal screen you're working on, scrolling everything out of view.
The same thing can be accomplished by typing "clear" into Terminal.

Control + D
This will cause you to exit the current shell in Terminal.
The same thing can be accomplished by typing "exit" into Terminal.

Control + Z
Suspends whatever is currently running in the foreground, turning it into a stopped background job.
Type 'bg' and hit Return to resume the suspended job in the background.
Type 'fg' to resume it in the foreground.
Type 'kill %1' to send the kill signal to the first suspended job.
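
A quick example session (assuming Bash; the job-control messages vary slightly by shell):

$ sleep 100
^Z
[1]+  Stopped                 sleep 100
$ bg
[1]+ sleep 100 &
$ fg
sleep 100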

!!
This executes the last command entered. If you run into permission issues, try entering sudo before !!.

Control + H
Essentially the same thing as backspace, but it is useful if you want to stay anchored on the home row keys.

Typing "top" into Terminal will display all of your active processes.
Similar to what you'd get from Activity Monitor, but within Terminal. Press "Q" to quit.

history + a number
If you've lost track of a command you typed earlier, you can type "history" into Terminal to retrieve a history of your commands. However, if you want to view a specific number of past commands simply type a space then a number after history. So, "history 5" would show you the last five commands you typed.
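
The history numbers are directly addressable with the shell's '!' expansion. A hypothetical session (the numbers will be whatever your own history shows):

$ history 3
  501  ls -ltr
  502  cd /var/log
  503  grep -i error system.log
$ !501

The last line re-runs command number 501, ls -ltr in this example.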

bind -p | grep -v '^#\|self-insert\|^$'
This command lists the key bindings that are active in the current editing mode (Command Mode and Insert Mode have different bindings), filtering out comment lines, self-insert entries, and blank lines.

ls -ltr
Show a long listing of files, ordered by modification time, in reverse order (most recently modified last).

find . -name "*.php" -exec grep -l "somesuch" {} \;
Do a deep search for the string "somesuch" in all files ending with ".php", starting from the current "." directory, and print the name of each file that contains it.
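
One refinement worth knowing: terminating -exec with + instead of \; passes many file names to each grep invocation rather than forking one grep per file, which is noticeably faster on large trees:

find . -name "*.php" -exec grep -l "somesuch" {} +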

Sunday, May 24, 2015

Has the Time Come for Software Cooperative CUs?

As we were departing php[Tek] 2015 last week, I asked my fellow attendees where they were heading. "London," replied Derick Rethans cheerfully, to which I replied with mock seriousness, "Ah... that's a long drive." The look of confusion on his face told me that a crucial element was missing from the conversation: he didn't know me well enough to tell that I was joking.

Open source is a lot like that. It can be difficult, for small businesses in particular, to distinguish between technologies - and technologists - that present a lasting opportunity, and those that may fall flat and kill a business model in the process.

At php[Tek], as with most of the grass-roots conferences I've attended over the past decade, I recognized an emerging phenomenon. It may not be so much a trend as a series of pieces falling into place in the economy and the community at large. Similar to what must have preceded the establishment of credit unions in the 1850s, there is an increasingly undercapitalized population relying ever more on applied software technologies. Open Source was a major component falling into place, not merely because it made certain software technologies cheap, but because it democratized access to the technologies and the learning needed to manage them. Much in the same way, credit unions made it possible for individuals and small businesses in impoverished communities not just to self-finance but to learn to oversee and manage their own growth.

Regardless of how accountants see software, creating and curating it is a primary factor in the success or failure of modern business operations. Yet even for those that are technologically skilled, the tangible and intangible capital costs can overwhelm the ability of any one business to maintain. Again, Open Source helps by lowering costs and making acquisition of skills feasible. Yet even Open Source can present too high a cost of adaptation and configuration management over time.

What Open Source does not yet do, and seems to be about to do, is provide a way for neighbors in the technology community - providers and consumers - to secure the future of a software technology together. That is, I think formal cooperatives, or "Software Credit Unions," are about to emerge from the economic primordial soup we call the Market.

Many if not most of the core technologies have already found homes in foundations, consortiums, non-profit charities, and public corporations. That's not what I'm pointing out. The applications of these technologies, which provide real value to business process stakeholders, are assets that are frequently constructed with non-trivial personal or business funding. A cooperative form of business would provide pooling of capital investments, sharing of risks, and amortizing of maintenance costs for members, as well as avoidance of the "razing" of small intellectual properties when such small businesses close.

If such organizations were formed under the same kinds of fiduciary ethics and practices as a credit union, it would be a boon to the future of technological small-business clients and open source contributors alike. It would help define financial and legal standing for popular projects, help quantify the value of contributions, and support necessary but otherwise marginal projects for their members. A cooperative form can also help ensure that peer-review processes (already a part of open source culture) are enforced to members' quality standards and not to the requirements of some ill-conceived third party.
Software has already irreversibly infiltrated our lives. I don't think it is even possible that cooperatives akin to software credit unions can be avoided. It is happening now. The question is not "if" organizations that hold our software assets will exist. They do now. The question is whether any such organizations will hold a fiduciary role for consumers and producers alike as a membership organization in a credit union model, or leverage us all at a disadvantage like a bank.

Monday, May 4, 2015

Dangerous Domain Vocabularies

A very smart young colleague at work has been introducing us to concepts of CQRS, Event Sourcing, and Domain Driven Design (DDD). One of the more prominent values held by DDD is the pervasive vocabulary used by Domain Experts - ubiquitous language is considered to trump any other technical constructions.

As well it should, at least in established disciplines and where the business actually has resident expertise available. Even where inexperience outweighs expertise, one still wants the professionals who are tasked with responsibility for the business outcomes to have a sense of ownership of the language.

But there are fallacies of belief that can be assumed as well:

  • Belief that a comprehensive domain language exists when it is not even a well formed vocabulary.
  • Belief that the semantics behind nascent language idioms are grounded, when they are cliches or abstractions derived from historical accidents (that is, legacy systems and environmental conditions that no longer exist). 
  • Belief that concepts of a domain model are part of some sort of mathematical reality that objectively exists above our own, unchanging and merely in need of discovery.
  • Belief that the ephemeral quality of language is not a significant factor when individuals move in and out of the domain.
  • Belief that the terrain of the problem space is substantially stable over the expected useful lifetime of the model.
  • Belief that domain experts' language never involves idiosyncratic forms that are self-inconsistent within a single bounded context.
It is the last bullet item that got me to thinking on this topic. A laboratory technician was describing to me a small dispute over a protocol in her lab one day. Two technicians were following two different procedures for diluting liquid samples. The terminology they had adopted was, for instance, to say that they were preparing a "one to three dilution".

It is more commonly expressed as a "dilution ratio of 1:3", and therein lies the problem. One tech said that means mixing one part of a solute to three parts of a solvent; the other claimed (apparently consistent with the procedures used in the profession) that it means mixing one part solute to two parts solvent, yielding three parts of an admixture. The vocabulary, having been neglected and forgotten by many of the practitioners, is no longer clear.

Mathematically, the ratio "1:3" is like a fraction 1/3, and most people would think of "one part of something to three parts of something else" at the same moment in time. The lab professionals, meanwhile, have adopted an idiosyncratic interpretation, assigning "1" to the "something" and "3" to "one something plus two something elses" - that is, they compare a variable in one step to a dependent value resulting from a subsequent step.

Consider this:
Step 1: take 1 part salt
Step 2: take 2 parts pepper
Step 3: mix salt and pepper

The result of Step 3 is a salt-pepper admixture of roughly three volumes, or a 1:3 dilution of salt in pepper. But mathematically the 1 and the 3 are different units, 1 being a unit of salt and 3 being a unit of (1 salt + 2 pepper). The reality of the process is, further, that the entity described by the 3 doesn't exist until the entity described by the 1 is combined with a derived quantity of an entity that is never made explicit. That's like telling a cook how to bake a pie with an ingredients list that omits the filling and includes the whole finished pie itself.
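
To see how far apart the two readings land, here is a throwaway Ruby sketch - my own illustration, not anything from the lab's protocol:

# Two readings of a "1:3 dilution" of salt in pepper:
salt = 1.0

# Lab reading: 1 part salt in 3 parts TOTAL (1 salt + 2 pepper)
lab_concentration   = salt / 3.0          # => 0.333... (about 33% salt)

# Naive reading: 1 part salt TO 3 parts pepper (4 parts total)
naive_concentration = salt / (1.0 + 3.0)  # => 0.25 (25% salt)

One reading delivers a mixture a third stronger than the other.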

And don't think this is a trivial thing. People have no doubt died over the misunderstanding and confusion brought on by this one shitty little idiosyncrasy. 

Wednesday, November 12, 2014

Version Control and Codependent Relationships

In the midst of others' conversations at a dinner meetup recently, I asked one quiet young designer/developer sitting across from me where she was from. She mentioned her home state, and that she worked for a certain marketing agency in that area. Furthermore, she offered, with a hint of wistfulness, the ironic observation that although they were a leading agency with a huge backlog, they were not up to speed with modern software development practices such as version control.

I also heard an undertone of fatalism, and frustration about how to communicate, and I'd heard that many times before. Particularly in Web Design as an art, modern software engineering practices have only begun to really take root and infiltrate as a professional practice. This is because only recently have Web Designers begun to recognize themselves as serious software professionals. 

That is not to say that they weren't serious before, or that they weren't software developers. It is just that in their daily practice, their brains had yet (and to some extent have yet) to converge on a common cultural recognition that they are professionals with a professional discipline.

There are many reasons for this:
  • personal immaturity - a person is "just not there" yet, and may not see the value in reinvesting effort in skill building
  • "fire fighter" mentality - fire fighters don't have to be concerned about building structures, they just try to keep the flames at bay
  • stress - people who feel under the gun have much less presence of mind for reflection, self-improvement, or process improvement
  • management reactivity - this contributes to stress too and IMHO is the most important root cause in a small business environment

Management Reactivity

The young developer mentioned that the idea of version control prevalent in their office was to yell over a cubicle wall, "Hey, I'm going to edit FuBar.html, is anyone else editing it?" Yet this isn't even rudimentary version control, such as copying files to snapshot folders or renaming .bak files - it is just a verbal form of a semaphore.

The reason for this immaturity of practice? Ostensibly, it is that they do not have the time to pick up a new practice and put it in place, while also getting the backlog worked on.  

The root cause is reactivity in management. I do not mean "knee jerk" reactions, although that is a visible sign of reactivity.  It may alternatively be that management is poorly trained and possibly even incompetent. By reactivity I mean any practice that undermines a continuous improvement process by constantly misaligning the goals and the actual values expressed to the team. 

Lumped together, you might just simply say it is bad management. Other signs:
  • the company does not allocate a sufficient amount of resources for continued professional skill building
  • calculated risk taking is discouraged; the level of proof required to bring in new techniques or technologies is set higher than the level of proof required to keep the existing known poor practices and technologies with persistent defects
  • supervisors are not actively contributing to work output, but are all mere overseers
  • heavy emphasis on documentation in planning, with little reference or use of those documents by the team performing the work
  • frequent use of the word "just," "only," or other hedging language that diminishes the cost/effort/time/importance/complexity/thinking required to move forward in a sensible direction
  • "Continuous improvement" is a cliche used often, but with no practical path of allowing developers to start moving down any path that changes the toolchain or tactics.  


Now, here's the thing: that developer is young and that developer is smart,  so that developer has the power to effect change. Period. And that should be the End of Discussion.

But it isn't the end of the discussion. That developer is also inexperienced and is fearful or at least risk averse, and it is the employer who has the money. There is a real power imbalance when the developer sees herself as the one who needs the money more than anyone else needs her skills. 

By postponing skill building, the developer puts herself in a position to be used reactively. 

By foregoing process and technology improvement - and suppressing the adoption of modern software practices - the employer keeps the developer in a co-dependent posture. 

The tactics the developer learns for dealing with problems reactively are employer-specific, and thus much less transferable. At best, they fail to make the developer more attractive to another potential employer. The employer can pay a co-dependent developer less, because the developer lacks confidence and lacks opportunities. Modern practices, on the other hand, make the developer more attractive to competitors and help equalize the balance of power.

You get the idea. The sad thing is, co-dependence hurts all parties in a relationship. The employer will fall behind competitors, and so will the employee. 

Sunday, August 3, 2014

Namespaces in Ruby

Ruby is a very plastic language. By plastic, I don't mean "fake" but easily shaped. I was considering namespaces, as they are in PHP and a number of languages derived syntactically from C:

namespace MyOrg\MyDomain\MyApp\MyPackage\Foo;

I was thinking of Ruby. In Ruby, there is no single namespace declaration; instead, the language provides the Module construct to accomplish more or less the same goals. The difficulty is that Module involves rather more syntax, not less.
Poking around Google, I came across this little gist in which Justin Herrick describes how he made a short DSL to have a nice brief Clojure-like syntax:

ns 'MyOrg.MyDomain.MyApp.MyPackage.Foo' do
   def fluggelduffel
   end
end
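
For flavor, here is a minimal sketch of what such an ns helper might look like - my own reconstruction of the idea, not Herrick's actual gist:

def ns(dotted_name, &block)
   # Walk the dotted path, creating nested modules as needed.
   mod = dotted_name.split('.').inject(Object) do |parent, name|
      if parent.const_defined?(name, false)
         parent.const_get(name)
      else
         parent.const_set(name, Module.new)
      end
   end
   # Evaluate the block in the context of the innermost module.
   mod.module_eval(&block) if block
   mod
end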

Herrick's solution takes advantage of Ruby's seemingly limitless ability to modify the module environment. And it works, with one limitation: constants referenced in a method like fluggelduffel, or anywhere in the do block for that matter, raise a NameError unless const_set is used:

ns 'MyOrg.MyDomain.MyApp.MyPackage.Foo' do
   def fluggelduffel
      puts A   # raises NameError: uninitialized constant A
   end
end

I played around with the code a bit to add an options hash:

ns 'MyOrg.MyDomain.MyApp.MyPackage.Foo', { :constants => { :A => "FUBAR" } } do
   def fluggelduffel
      puts A   # A now exists on module Foo, yet this still raises NameError
   end
end

The code simply calls const_set in a different place. The constant A is there in module Foo, but it isn't visible in the lexical scope in which puts is referencing A. We can address A explicitly via MyOrg::MyDomain::MyApp::MyPackage::Foo::A, but how ugly is that? We can also use const_get('A') but that is pretty ugly too.

The problem is that bare references to constants are resolved in the lexical scope in which the block was created. It has nothing to do with the scope the constant is defined in. What to do?

There isn't a lot that can be done. If you're using unqualified constants, that's pretty ugly in itself... polluting your code with global references and all. If you really need that (dis)ability, const_get('A') follows the nesting chain all the way up. I've found that self::A works fine for the globals I've defined locally using const_set, though I'm uncertain if there are any side-effects or weird interactions. In this way, constants can be defined dynamically, and attached to the initial namespace definition.