Thursday, December 17, 2009

Delusions of Adequacy

As I sit here sipping my Bruegger's coffee, young adults laughing it up and jostling the chairs behind me, I'm thinking of a small patch job I am nearly finished with on the site. More precisely, how frustrating it was to do the patch.

The kids move on. Things are quieter now. I'm listening to the Imogen Heap channel on Pandora. In another post I referred to the "light rust" on my technical skills. That's just a metaphor. The deeper feeling is that I've always been outside looking in, regardless of my skills. Since technical leadership was never the fundamental cause of that issue, I don't consider it the solution either. In my very first paying job, I didn't even know the programming languages or have any genuine experience working with databases... but I spent a lot of time understanding the requirements and putting in place scripted procedures to ensure that assumptions were verified and the rationales for decisions were documented... and I learned to use source control on everything, including the database schemata. These are things I still see programmers failing to do, even in what they consider "best practice".

Regina Spektor screeches out a tune about blue lips on Pandora. She's the weird one with the cool tune on the Apple commercial. The process we follow has a deep effect on what we can accomplish. Like a lever, a good process can extend the span over which useful work can be performed, and a bad process will naturally act to reduce your effective span. As a 4GL developer, I had a very effective process for producing solid applications that not only did not crash, but addressed the concerns of many different stakeholders. I don't feel like I've had that sense of effectiveness in development using the Joomla code base. I am managing the problems of patching code bases, but not crunching out large chunks of solid code like before.

It isn't a good feeling, like trying to program while wearing mittens. Imogen Heap asks now "Where are we? What the hell is going on?" on Pandora. It is a good question. My code works, but does it fit into a larger scheme of things? Where is the migration path? The latter problem is as much or more a broader problem with the modern business model of Web development, fractured as it is by cut-rate competition and the tendency of the business community to respect corruption more than honest assessments. There will be no maturing of the discipline when the discipline itself is just a subterfuge. I suppose that's why I'm ambivalent about the Agile rhetoric: on the one hand, it appeals to my objectivist tendencies; on the other hand, as with any practice it is no better than its practitioners, many of whom seem to be just as rationalizing as the best of the Big Methodology gurus of previous generations.

Maybe it is my scruples again. Even if you get something built effectively, how ethical is it to construct a building that a customer cannot afford to maintain? How ethical is it to build the building without bringing up the real and potential support costs? Many of the software systems built today are like Bucky Fuller's 1960s tensegrity domes: really spiffy to look at, but making additions later on is always an engineering feat with bespoke parts rather than a construction project using commodity materials. Does that small business realize that, should they ever want to extend the nifty Web app built at their expense, you will hold your possession of the code and your in-house hosting of the running system as bargaining chips? Does it make any sense at all for a customer to pay to put the ongoing hosting of mission-critical enterprise applications in the hands of a software development company, without which the company itself could fail? To me, that's like an otherwise healthy human putting themselves on life-support machinery.

What is practical? Well, what I'm hearing from all sides is that Ruby on Rails and Mac OSX just work. eXtreme Programming is a working software team practice. Scrum seems to kind-of-sort-of help people in big institutions hedge against some of the pathologies intrinsic to that environment. Providing stateless ("REST") Web Service programming interfaces is a good thing. Yet architecturally, the World Wide Web is such a rat's nest of ad-hoc-ery -- even at the so-called "Standards" level -- that people really need to go the extra mile to separate their actual business concerns from the misconceptions and false premises of the medium.

Don't be deluded; few investments into Web technologies are ever truly adequate. Most of the Web has been engineered for obsolescence.

Sunday, December 13, 2009

Idea for SplotchUp!

I've been searching for a reason to begin working on the Splotch Up site. Many of the people at the Web Design Meetup suggested concepts and thoughts on usability or touched on the means to the end -- mainly in the model of socially edited networks -- but I lacked the compelling reason why anyone would choose to put the effort out or visit the site. A recent Infonomics piece suggests a reason: as a legal defense.

As it happens, a company which cannot demonstrate that it knows its own resources also cannot provide an effective defense against claims of neglect, malfeasance, legally inadmissible documentation, etc.

To quote Hugh Laurie as fictional character Dr. House, "Everybody lies." Knowing your own process is in general a good thing, but it is dreadfully difficult to define business processes with precision and extremely resource intensive to do so with any reasonable accuracy over time. I learned this lesson acutely through my years developing enterprise quality systems, particularly change management tools to support ISO9001 implementations. What I found was the broader business analog to what Dr. David Parnas (an early advocate of the concept of encapsulation in modular programming) wrote about in his snarkily titled paper, "A Rational Design Process: How and Why To Fake It". Our institutions are us, and to the extent that we have a tendency to make stuff up, so do our institutions.

Legally, documentation is used as a form of forensic evidence. Just as DNA can degrade over time, so can other forms of forensic evidence. That includes documentation. Like many engineers of his time, Parnas was a strong advocate of precise documentation; the problem is that the more you invest in documentation which does not participate fully in the enacted process, the less you have to enact the process, and the greater the real depreciation will be in the documentation asset. When was the last time your accounting rolls listed the value of your documentation or the amount it depreciated over the past year? Probably never. So right there you have an obvious contradiction between ideological business doctrine and its practice.

Gerald Weinberg described these ethical conditions, particularly the incongruities between the values of employees and the values reflected by institutional practices. The lack of congruence leads to what he termed cognitive dissonance. The static picture offered by ISO9000-motivated approaches to documentation is bad to the extent that it cannot effectively document what is necessarily a set of dynamically interleaving, fluid processes. The typical solution is to reduce the precision, introduce weasel wording and hedging language, and basically make process documentation read like a tome of Nostradamus: easily interpreted to mean anything the reader wishes to infer. That is also known as plausible deniability.

Your garden-variety Big Corporation process documentation is just a means of covering up one's legal backside, and is only secondarily a business or engineering artifact. That's the chief fallacy of standards like ISO9000 and CMM. They are meant to provide a framework for cover for ethical incongruities -- that is, they are lie management platforms. That may be why such companies come up against such high levels of dissonance when attempting to incorporate XP or other Agile methods.

Your garden-variety Small Company, on the other hand, cannot afford big up-front documentation, but lack of some form of resource identification is an absolute roadblock to scaling up and dealing with situations rationally. Whether you're small or big, it is generally your people acting heroically who save your skin when things go wrong. When properly utilized, however, documentation is an intrinsic part of the enacted system, not a waste product produced as a side effect of the process. Or rather, it is both a side effect of the process and a product that re-enters the food chain in the next cycle. Enabling that is what Splotch Up is about.

Tuesday, December 8, 2009

Another pain point

Another thing that I find really, truly, persistently, and totally pissing me off is all the problems with performance and stability on my Windows XP laptop. I've gotten to the point that I never want to use anything labeled with the Lenovo or Microsoft brands ever again. Every time the system takes forever and a day to respond to a mouse click, or stops responding after "recommended updates" were applied, I have to fight the urge to toss the laptop, keyboard over screen, off the desk.

I have to admit that I personally stress the system a bit more than most, but probably not half as much as do many other developers. I don't know why, but Windows is still unstable after all these years, and I trust Microsoft implicitly to get it wrong again with Windows 7. Try a Google search and compare "OSX crashing" vs. "Windows 7 crashing". Yeah, even OSX crashes, but not as routinely. My experience with UNIX-style systems says that the crashes won't increase exponentially over time as they seem to have done on every new Windows system I've ever owned.

Why should anyone have to put up with defective software? Whether I'm working or recreating, my patience is too valuable a thing to waste on Windows.

A possible solution: a laptop OS that runs everything exclusively in virtual machines, the way VMware or Xen work. Nothing ever really gets installed on the base OS; instead, each thing is inserted into its own virtualization compartment. Compartments could be federated under this method, so that I could choose how devices and apps are bundled to make a runnable session, and if one compartment causes a session to crash, that compartment could be isolated.
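The compartment idea can be sketched in a few lines. This is a toy Python model of the concept, not any real hypervisor API; every class and method name here is invented for illustration:

```python
# Toy model of the compartmentalized-OS idea: every app or device driver
# runs in its own compartment, and a crash in one compartment is
# quarantined rather than taking down the whole session.

class CrashError(Exception):
    """Stands in for a fault inside one compartment."""

class Compartment:
    def __init__(self, name, app):
        self.name = name
        self.app = app            # callable standing in for the virtualized app
        self.isolated = False

    def run(self):
        if self.isolated:
            return f"{self.name}: isolated, skipped"
        try:
            return f"{self.name}: {self.app()}"
        except CrashError:
            self.isolated = True  # quarantine the faulty compartment
            return f"{self.name}: crashed, now isolated"

class Session:
    """A runnable session federated from independent compartments."""
    def __init__(self, compartments):
        self.compartments = compartments

    def tick(self):
        # One compartment crashing does not abort the others.
        return [c.run() for c in self.compartments]

def flaky_driver():
    raise CrashError("device driver fault")

session = Session([
    Compartment("browser", lambda: "ok"),
    Compartment("flaky-driver", flaky_driver),
    Compartment("editor", lambda: "ok"),
])
print(session.tick())  # the flaky driver crashes and is quarantined
print(session.tick())  # later ticks skip it; the rest of the session lives on
```

The key property is the last line: the second tick proceeds normally with the bad compartment fenced off, which is exactly the behavior a crashing Windows driver denies you today.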

Doing Something Worthy of Doing

I'm constantly pulled back to this train of thought: of the options that are sometimes placed before me, and the few that I had enough understanding of to consider pursuing, are they worthy of my attention?

What, seriously, has come up in the past several months, and why am I not on track pursuing one of them? Instead I continue to waver, unsure of myself, not unwilling to commit to the right cause but never less sure of the right causes to which to commit.

Scientific coding. I had a small job to port an algorithm from Maple to C++. However, it was in the spring, and I was not able to complete the task while taking care of finals and graduation. Mostly I had roadblocking questions about the Maple code and lost touch with the professor for too long. I let the professor down and now consider it a missed opportunity.

Web business. Some friends have talked about this from time to time. Various ideas from a system to order printed materials; course survey system for small colleges; dog kennel club store; a "heartbeat" notification site that triggers actions when someone stops visiting; a Web 2.0 design tool using Inkscape; SplotchUp, whatever social networking that was supposed to be; virtual flyerwalls... a lot of curious thoughts without a reasonable idea for return. People say you need to take risks but that implies you have some means of judging risk to begin with, and I could never manage to put a confident story together that would say "go ahead with it".

One thing I know, is that altruism is destroying me. I feel unable to withhold, yet also unable to leverage the kinds of benefits I see others gaining from their efforts. Even with the open source projects like Atramentum, the real benefit is building connections to people -- users and collaborators -- and *pfft* the few bites I got there evaporated. And in business, where the "hook" is everything, I find myself unable to present myself with strings attached or to be assertive enough to compete effectively for business.

Even still, people distrust me and go with those obviously intent on doing them wrong. I'm the "nice guy" at whom everyone smiles, yet who is shunned in every circle. It has been this way since I can remember, the sense of isolation ever clouding my thoughts. Sometimes that sense of marginalization becomes acutely painful, the mental equivalent of when I was told to work in a freezing pool of water, my muscles shaking so violently I barely had the coordination to crawl out. I still don't know where I fit in this world.

When almost nobody is interested in what you have to say or can do, where is the sense in doing anything at all? Intimacy is among the most basic of human needs, and when it remains unfulfilled we starve just as surely as if we were going without food.

In the end it all sums up to nothing, just so much shouting into the wind.

Sunday, December 6, 2009

Pain points

They say that when you're an entrepreneur, you look for a pain point and do something to reduce the pain. That gives you a narrative, a reason for being, The Story for why people should care. So I ask myself what are my pain points?

Based on my previous post, that might be obvious. I'm not giving up all forms of social networking, but I'm also not going to keep dumping effort into fighting a battle I cannot win. I'm an INTJ; my strength is in systemic thinking, and I gravitate toward understanding the nature of things -- these are not characteristics of business networkers driven by hope of monetary gain. My pain point here is that I see a huge amount of effort put into manipulating people, and very little into the advancement of the human condition.

Now, I'm not writing that there isn't pure research going on in the academic world, or that entrepreneurs ought to be doing research or ought not to be seeking profit. Really, a lot of what is out there on the 'net is just entertainment which does nothing to establish the conditions necessary for sustained healthy growth, lowering of real costs of energy, or any deepening of knowledge. One good thing I see is that some knowledge -- with a lot of noise -- is more broadly disseminated due to the Web.

That general pattern, Web as Cross-Pollinator, has been with us from before the inception of the Web, by way of the DARPA internet. It would be an overstatement to claim it is THE fundamental pattern of Web solutions from which all others derive, but not by much. So perhaps my "pain point" can be restated in a way that fits with this pattern. We are destroying our very futures by putting effort and money into lifestyles that don't lead to lasting benefit for future generations: people playing on Second Life, Facebook games, and other superficial social networking; companies building software as services to solve trivial tactical issues while entangling themselves in intractably expensive integration scenarios; and individuals pissing away their personal assets on consumer goods that raise more problems than they solve. There is always an aspect of utility to each of these areas, just as NASA's long history of failures inevitably spawns important data (by design) and interesting technologies (by side effect). But the ratios of cost to benefit, and of pleasurable short-term benefit to intangible lasting benefit, are completely lopsided.

We also allow "long term" to be misanthropically applied by power-hungry radicals with anachronistic political agendas. I've been guilty of using that speech pattern myself in software. I initially adopted that language from studying the make-believe Quality programs of the '90s, which were based on cold-war politics and Big Methodology mentality. Long term or systemic thinking neither implies nor requires greater centralization. And proper evaluation of the relative worth of a technology or practice is made all that much more difficult when a few institutions dominate many people. Better to spread the value judgments around and use randomness to cancel out the effects of corruption and poor integrity, than to concentrate them in one place with strong personal and institutional biases dominating the process.

R.B.Fuller posited that we should be seeking systematic solutions to problems through apolitical design. Yet even his legacy appears on the face to be corrupted by people who use monetary awards to marginalize those who do not share a left-of-center political agenda. Reward mechanisms are inevitably driven by ideologies. How do we make rational, objective valuation judgments without allowing personal agendas to overtake and game the economic system or even just a social Web system set up to fund truly beneficial projects? How is "truly beneficial" defined in practice? This is our national pain point, and one which I realize has deeply affected my own job search in recent years.

Thursday, December 3, 2009

Is Social Networking a Fraud?

I never was particularly good at navigating human social networks, so what makes all you social networking lemmings think that someone like me is going to do any better just because I'm on Facebook or LinkedIn? To paraphrase R.B.Fuller, the most important thing about me is that I'm an average, ordinary human being. So even to my friends, I'm virtually invisible.

There is a vanishingly small probability that any one of you social types will ever see my posts, let alone bother to read through them, let alone be able to make out the words with more than two syllables. I can't help it that the working memories of social networkers are limited to fewer than 60 characters... most of us ordinary humans still think in terms of narratives, like the "sentences" and "paragraphs" we learned when pencils were still made from trees.

As an experiment, try placing a "salt" text somewhere on your Facebook profile. I used "A TREE FALLS IN THE FOREST", and added an instruction to anyone who saw it to post it to my wall. In the whole year, one person took the bait. So for all intents and purposes, the time I put into filling in information on Facebook is wasted. Ditto for this blog. I'm invisible. Virtually no one will read it. On the other hand, Facebook gets real monetary value from me filling out all that demographic information for them, and I get nothing out of it myself.

So here's my thesis: social networking tools are not additive but multiplicative. If you're already skilled at the art of making friends and holding together social relationships, social networking will multiply your reach. If, like me, you can be absent from an event and have everyone swear you were there; you speak loudly yet people right in front of you consistently fail to register your voice; or you remember most acquaintances with clarity but they haven't the foggiest idea of who you are; then your multiplier is somewhere between 0 and 1 exclusive. The fact that everyone knows your profile sits out there, comfortably ignorable, renders us even more invisible.
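The additive-versus-multiplicative distinction is easy to make concrete. A toy calculation, with purely illustrative numbers:

```python
def reach(real_world_connections, multiplier):
    """Multiplicative model: the tool scales what you already bring to it."""
    return real_world_connections * multiplier

# A natural networker: 200 genuine connections, a healthy 5x multiplier.
print(reach(200, 5))     # 1000
# Someone socially invisible: multiplier strictly between 0 and 1.
print(reach(20, 0.5))    # 10.0 -- the tool shrinks effective reach
# An additive model would instead promise everyone the same fixed gain:
print(20 + 100)          # 120, which is not what actually happens
```

The marketing pitch for these services is implicitly additive ("join and gain an audience"); the observed behavior is multiplicative, and a multiplier below one is a net loss of time and attention.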

As a corollary to this hypothesis, I conjecture that were I not even using a social networking tool, I would have at least stuck out by virtue of my absence. This is based on my long experience having my speech ignored: I learned that lowering my volume was much more effective at gaining the attention of those around me, than was raising it. I remember family telling me that my problem was I spoke too softly; in fact it was just such a feint which gave them this impression.

The real bottom line is this: after a couple of years of use, I'm not feeling, seeing, or obtaining any material benefit from using social networking services. I see no particular reason why I should expect anything different by continuing to do the same thing. So the natural conclusion is that I should stop, and divert my attention elsewhere.

The Fallacy of Social Media

The fallacy of social media is that it can turn a non-social personality into a well-connected socialite. Take a close look at the people who are able to leverage Twitter, LinkedIn, Facebook, etc and those who use these tools a LOT but never quite make the cut. I think what you'll find is that the social networking tools can have a multiplying effect for those with strong external networking skills, but are just a time sucking device for those who are not otherwise socially inclined.

Over the years that I've been on Facebook and Blogger, many have seen and ignored my Facebook posts and few have bothered to read my Blogger entries even when posted to Facebook. Why should they? For one thing, my interests are arcane and obscure. For another, other than occasional blowups they mostly are not excited tirades. To quote RB Fuller, the most special thing about me is that I'm an average, ordinary human being. I'm just not worth knowing. The sad truth is, the people we do pay attention to aren't really any more worth knowing either, it's just because of cosmetic factors or conceptions of affiliation which are largely unfounded.

But who really benefits from us socially inept types? As an experiment, I posted the salt phrase "A TREE FALLS IN THE FOREST" to my personal Facebook profile, and asked anyone who saw it to post the phrase to my wall. Over the following months, no one noticed. Over a year, one old friend noticed and took the bait. After that, no one else. So my profile is essentially unnoticed by 99% of my own network. Why bother filling it out? The only real beneficiary is Facebook itself, which is now valued in the billions. I'm certainly not against a company making money, but it was my time, my effort, and my demographic data that they are using to do it, and I'm not getting a material benefit from it.

My advice? Blog if you will. I do it for self-therapy, not because I expect a following. I fully understand that I have been and probably always will be the proverbial tree falling in the forest. Don't invest your time and demographics in Facebook unless you can reasonably expect to get measurable, material results in exchange for your disclosures. Got someone to talk to already? Then go ahead and tweet, post, and text them. But don't make the mistake of thinking these tools will substantially change the number of genuine friends you make. If people don't already naturally gravitate toward you in real life, they won't gravitate to you on Facebook or Twitter either. You can choose to pretend (in which case you're selling your soul for a false sense of intimacy) or become embittered (in which case you're just ruining the experience for everyone else), but perhaps the best thing to do is to let the kids enjoy their promiscuous social networking and focus on more rewarding things in life.

OS/2 Presentation Manager: A Superior Desktop?

I'm increasingly frustrated by the erratic performance of my Windows machines. The software is so extremely kludged, so inconsistent in implementation, and so ridiculously bloated with unnecessary features. As a patched together DOS-based ripoff of the Macintosh GUI, Windows took the market because (a) it ran on PC clones (b) it did not require the level of administration that UNIX (ne Linux) workstations imposed and (c) the WIMPy interface meant that untrained, unskilled, and illiterate people could think of themselves as computer experts and could do some practical stuff while they otherwise were wasting their time. How the tables have turned.

Now, the desktop most widely recognized as superior by both Liberal Arts majors and techies alike is a UNIX-based system: Mac OSX. As an acquaintance in the Ruby community told me bluntly, he regarded a potential employer as completely unsuitable after they informed him of their requirement that developers only use Windows machines. Apparently they "had a contract with the vendor". I don't know him well, but his reputation is of an objective, rational thinker prone to taking measurements of his own productivity. It isn't just an artsy thing: Mac OSX does seem to let people get more done.

OpenStep is the Solaris cousin to the NeXTSTEP desktop now sported by OSX. My memory is that OpenStep was developed in conjunction with the Corba Object Request Broker technology, and later forked from it. Another desktop with this property was OS/2's Presentation Manager, or more particularly the Workplace Shell.

Now, here's where the title comes in: any of IBM's languages with Corba (SOM) bindings could extend any of the desktop primitives. REXX had such a binding, and I found it trivial to inherit new folder types that automatically created sub-folder structures with standardized instances of files, and implemented type-specific menu methods. While you can monkey around with Windows desktop hooks to fudge up the menus, it really is not the same thing. On OS/2's Workplace Shell you could glue together entire applications by inheriting and extending the desktop objects. When you did it, it worked like you expected it to. And you could do it all just by writing a bit of built-in scripting. Very nice indeed.
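The Workplace Shell pattern -- subclass a desktop primitive and the new behavior shows up wherever that primitive appears -- translates naturally into modern OO terms. Here is a rough Python analogy of the REXX/SOM folder example; this is not OS/2 code, and every class and method name is invented for illustration:

```python
import os
import tempfile

class Folder:
    """Stand-in for a desktop folder primitive (think WPFolder in SOM terms)."""
    def __init__(self, path):
        self.path = path

    def open(self):
        os.makedirs(self.path, exist_ok=True)

    def menu_actions(self):
        return ["Open", "Delete"]

class ProjectFolder(Folder):
    """A derived folder type: opening it auto-creates a standardized
    substructure, and it contributes type-specific menu actions."""
    SUBFOLDERS = ("docs", "src", "tests")

    def open(self):
        super().open()
        for sub in self.SUBFOLDERS:
            os.makedirs(os.path.join(self.path, sub), exist_ok=True)

    def menu_actions(self):
        return super().menu_actions() + ["Build", "Run Tests"]

root = tempfile.mkdtemp()
pf = ProjectFolder(os.path.join(root, "myproject"))
pf.open()
print(sorted(os.listdir(pf.path)))   # ['docs', 'src', 'tests']
print(pf.menu_actions())             # ['Open', 'Delete', 'Build', 'Run Tests']
```

The difference on OS/2 was that the subclass lived in the desktop itself, so every folder of the new type behaved this way system-wide, not just inside one program.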

My experience at the time was that from a productivity standpoint the OS/2 solution was far superior to the Windows NT user interface. But where OS/2 was the overweight alternative at the time, now that title clearly belongs to Windows 7. That's an irony, because the UNIX/OpenStep workstation represented by Mac OSX is a kind of an intellectual successor to the OS/2 workplace shell (if not strictly a successor in terms of code).

Monday, November 30, 2009

Problem with Bug Trackers

I once worked on all manner of quality record systems. Anything from enterprise level ISO9000 incident management systems to team-level bug tracking tools.

The apps were often built under the premise that they "own" the data, so requirements were architected to make them one-size-fits-all solutions with little (or no) attempt to address the differing scopes within a complex organizational hierarchy. The concerns of individuals, teams, and business groups (or other relatively stable organizational divisions) conflict, and couldn't be addressed effectively by the rigid models. For instance, a hardware engineer could not track a design issue with the formal ISO9000 tool, because reporting the issue to the latter automatically escalated it from a design-phase to-do item to a legally visible "incident" (and notified all managers too).

We knew that teams needed to coordinate amongst themselves and track bugs within their own cycle, but management wouldn't acknowledge distinctions for those areas of application. So most of the needs were addressed through skunk-works tools.

Many of today's to-do list Web sites have the flavor of those skunk-works tools. In particular, the sites tend to see the world as if it were a single flat list of categories, items, or names, rather than as deeply interconnected semantic models. Mostly that's because building semantic models is something few people are prepared to do, and fewer still can justify the cost.
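To make the distinction concrete, here is a minimal Python sketch contrasting a flat tagged list with an explicit semantic model. The data, scope names, and relation names are all invented for illustration:

```python
# Flat model: a single list of tagged items. The relationships between
# items exist only in the users' heads.
flat = [
    {"title": "Login fails on refresh", "tags": ["bug", "auth"]},
    {"title": "Add session timeout",    "tags": ["feature", "auth"]},
]

# Semantic model: items are nodes with an organizational scope, and
# relationships are explicit edges, so scoped questions like
# "what blocks the release?" become simple queries.
items = {
    1: {"title": "Login fails on refresh", "scope": "team"},
    2: {"title": "Add session timeout",    "scope": "team"},
    3: {"title": "Release 2.1",            "scope": "business-group"},
}
edges = [
    (1, "blocks",     3),   # the bug blocks the release
    (2, "depends_on", 1),   # the feature depends on the bug fix
]

def blockers_of(node_id):
    """Everything that explicitly blocks the given item."""
    return [items[a]["title"] for a, rel, b in edges
            if rel == "blocks" and b == node_id]

print(blockers_of(3))   # ['Login fails on refresh']
```

Notice that the team-scoped bug can stay a team concern until someone at the business-group scope asks a question that crosses the edge; the flat model forces one scope's view onto everyone.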

Saturday, November 28, 2009

Surrounded by bugs

I had an interesting dream, motivated no doubt by events this past summer in my yard. In the dream, I was walking down a sidewalk in Raleigh, body-pillow in hand, and passed a 2-person crew of city-employed exterminators. They were polite, greeting me and going about their business. Except that they didn't warn me that I was walking into an area they had just treated. Looking to my left and right, front and back, I found myself on a sidewalk surrounded on all sides by what was only now apparent as fire-ant ground.

To the rear near the street, what had appeared to be a rise the size of a baseball pitcher's mound was actually a fire-ant nest. The exterminators had disturbed it to pump in poison. But elsewhere around the walk were ground-level spots they had also disturbed. Fire ants were just now pouring out in all directions. It was one huge nest. Oh, and I was in sandals.

I had been subject to a "drive-by" by the two workers! It didn't help that they had just poisoned the ants -- it wouldn't kick in for several minutes at least. Now it was no longer a question of whether I'd be bitten, but how to find a way to navigate the path and minimize the number of bites.

Awakening, it occurred to me that this dream was perfectly metaphorical in describing modern Web-based application work. You're surrounded on all sides by an infrastructure infested with some really ferocious bugs, any one of which is tiny but whose cumulative effect can be deadly. And navigating them is a real drain on productivity. Who were the city workers? One represents the well-intentioned standards working-group people; the other, the technology vendors. (It can be hard to tell them apart sometimes.)

Friday, November 20, 2009

Web developers getting in over their heads

Quoting from a recent posting on a local "Web CMS" group:
"I had a [php webcms] based website, crashed it, and now have a really lame [web host brand] site. I started into setting up a new site but can't seem to stay focused and lack the know how to set something up."

Well, this seemed like such an elegant question, a real teachable moment. The original poster's system imploded under the pressure of an increasing learning curve. My point is that every choice to deploy technology comes with consequences, particularly on-going requirements to maintain the infrastructure.

Open source or not, you need to understand your own needs and constraints first. Never mind that the architectures of Web publishing frameworks are fragile enough that ordinary users can irreparably crash them (an obscure weakness of the O.P.'s particular webcms' architecture).

I won't mention the name (COUGH/DRUPAL/COUGH) , but all the PHP based make-believe CMS's suffer from major architectural flaws and a generally poor fit to efficient work flows.

Thursday, November 12, 2009

The Java Trashbag

Summarized from a recent posting, this is a list of the things a bona-fide Java developer is supposed to grok.

I summarized this because I'm interested in development. On the other hand, it makes me realize just how much crap people will put up with in order to circle their professional wagons around themselves, and fence everyone else out. It begins to appear very religulous.

Now, I'm not against Java. I like the idea, and I've done some programming in it. But the landscape is looking pretty convoluted. Is it that a lot of former C++/Corba-ORB enterprise developers never learned the lesson of simplicity? The reality is that there are multiple perspectives reflected here, and they don't really come together rationally. (I've commented on that lack of rationality before.) It is likely the case that developers who can check all of these items know little more than surface exposure and cookie-cutter reuse of sample code.

I note too, that the writer omitted any mention of layered or tiered architectures and client-server or peer-to-peer organization. He also compared SOAP to REST, a classic category error which is ingrained among many in the Java community. REST is a design philosophy; SOAP is a message packaging protocol. The two are not mutually exclusive. REST practitioners often utilize privately formulated well-formed XML as both the package and payload. Along with URL addressing conventions selected to facilitate caching, this "raw XML" most visibly represents the REST philosophy reduced to practice, and thus surface details are mistaken for the thing itself. REST means that you use the exchange of representations of objects between otherwise stateless servers to accomplish the goals and behaviors of the system. REST is akin to using bakers' carts on a factory floor to exchange parts from station to station, instead of keeping everything inside a single complex closed machine. SOAP can be used in a RESTful manner, just as raw XML and URLs are often mangled to implement interfaces to stateful systems.
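A small sketch may help untangle the category error. In Python, the same resource representation can travel bare or inside a SOAP envelope; nothing about the envelope forces the server to keep session state. The URL and the element names here are invented for illustration:

```python
import xml.etree.ElementTree as ET

# A representation of a resource -- the thing REST actually exchanges.
order = ET.Element("order", id="42")
ET.SubElement(order, "status").text = "shipped"
raw_payload = ET.tostring(order, encoding="unicode")

# The same representation wrapped in a SOAP 1.1 envelope. The envelope
# is just packaging; statefulness is a property of the server design.
SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
body.append(order)
soap_payload = ET.tostring(envelope, encoding="unicode")

# Either payload can target a cache-friendly resource URL rather than a
# single opaque RPC endpoint -- that choice is the RESTful part.
resource_url = "https://example.com/orders/42"

print(raw_payload)
print(resource_url)
```

The point: swapping the packaging (raw XML vs. SOAP) changes neither the resource addressing nor the statelessness of the exchange, which is why "SOAP vs. REST" compares things on different axes.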

The list

Java, OOP, and the core library idioms
ex: What's the difference between String, StringBuffer, and StringBuilder?
Why implement interfaces instead of subclassing abstract classes?

Character of The Platform
Subsets of the Java2 platform: ME vs EE vs SE
Deployment choices: Applets, Servlets, EJBs
JVM Hosted Syntaxes: JavaFX, Groovy, Jython, JRuby, Clojure, Rhino (Javascript), Scala...
JVM configuration and JRE, SDK differences

Enterprise Development Frameworks: Spring, EJB, JPA/Hibernate

Some Scripting Skills: Python, Perl, XSLT, Ruby, command shells...

Basic web service development:
Servlet containers, Tomcat, GAE
REST and WSDL, SOAP, XML and JSON payload formats

Multithreading: spawning threads, thread intercommunication, thread monitoring

Database Access: SQL, JDBC, JPA/JDO

Web Client Scripting: AJAX, Client-side Libraries (JQuery, YUI, Prototype, Mootools...), GWT

Choose an IDE and learn under it: Eclipse, JDeveloper, IntelliJ...

Build management tools: ANT, Maven, CruiseControl, Hudson...
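Since the list leads with the String/StringBuffer/StringBuilder question, a minimal sketch of the answer may be useful: String is immutable, so repeated concatenation allocates a new object on every pass; StringBuilder appends into a single mutable buffer; StringBuffer has the same API but synchronizes every call. The class name below is mine, invented for the demo.

```java
// Demonstrates why StringBuilder exists: String is immutable, so the
// += loop below creates a fresh String object on every iteration,
// while the StringBuilder loop mutates one internal buffer in place.
public class ConcatDemo {
    public static void main(String[] args) {
        // Each += here allocates a brand-new String.
        String s = "";
        for (int i = 0; i < 5; i++) {
            s += i;
        }

        // StringBuilder appends into a single growable buffer.
        // (StringBuffer is the older, synchronized variant.)
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < 5; i++) {
            sb.append(i);
        }

        System.out.println(s);             // prints 01234
        System.out.println(sb.toString()); // prints 01234
    }
}
```

Same output either way; the difference is allocation behavior, which is exactly the kind of thing the interview question is fishing for.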

Accessibility? Usability? How about Practicality?

Talking to Saroj led to other thoughts about accessibility in practice as well. She showed a slide set at the Web Design Meetup with the humorous pie chart. Note the yellow and magenta areas in particular.

The shifting sands of Web browsers and server technologies upon which people build their house-of-cards Web sites and pretend-Web-CMSes have limited the potential for truly accessible sites by consuming excessive developer cycles (as suggested by the chart). An even worse roadblock is that the subsurface, founded as it is upon HTML recommendations, is a growing swamp filled with plenty of rotten ideas, many of which were chosen to limit the usefulness and accessibility of content.

Take the DIV and SPAN tags for instance: designers still use them as layout primitives, rather than as purely semantic concepts of "group" and "fragment". I assert that this is still table-based design, just with different labels. People make pure-CSS designs this way, but it is still bad practice. The issue goes beyond whether one or another screen reader can ignore the markup and more-or-less give something like what a non-vision-impaired person might see. The problem is that the underlying HTML formatting architecture is biased toward this kind of practice, so in order to achieve the desired visual appearance, designers are still intermingling markup-just-for-the-purposes-of-layout with semantically meaningful content. Thus we're really not separating "style from content", just CSS style attributes. Much of the layout code is still embedded in the content.

Similarly for guidelines like placing an H2 tag before a UL/LI block used for menus: from an architectural perspective it is a hack. A menu is not a heading, nor a heading a menu, although both may be considered navigational primitives. Thus we are introducing rigid structures which don't mean what they say, and at that just for the sake of a limited set of reading devices. Which is to say, we are making guidelines to mangle non-semantic HTML to work with a specific vendor's browser because of what it can do today.

I certainly understand the rationale: it is pragmatic. A real tool can read that code, and a person gets fairer access to public sites because of it. That's a good thing. The bad thing is the twisted architecture which then telescopes and reinforces itself by twisting working practices. Like a tree growing around an obstruction such as a rock: once grown, the tree remains deformed even if you remove the rock. The BEST practice would be to grow the tree in the open, clear of obstructions. That's what generic markup is about. (And if you want to extend the metaphor, DITA is like training the tree to an arbor. Maybe that's what it takes for farming levels of productivity in generic markup.)

I'm hoping to see more in the ARIA recommendations about this, but given the Web vendors' penchant for throwing obstructions into the language machinery, I'm no longer as optimistic as I once was.

Considering Accessibility Again

Several months ago, I was considering a technology position at NC State. The position has since closed, but I was fence-sitting anyway. At the time I had the sense that it might involve more enforcement or governance than I desired, but after speaking with others I believe I was wrong about that. Still, I felt they deserved more than someone who was hesitant about the position.

Anyway, the position was for someone who could act as an accessibility guru and technology liaison. This person would help raise awareness of facilitative tools and techniques for making technology services more accessible and usable. Speaking with Saroj Primlani, who had served as NC State's accessibility coordinator for many years prior, prompted me to rethink my rationale.

Not that the position is open any more... it isn't. But I'm now convinced it is not so much about enforcement and governance as about awareness and efficiency. And that I can buy into.

Saturday, November 7, 2009

Linking Bipolar Disorder to Societal Change

In my job search, as I peruse the many and varied job listings, I am struck by the cacophonous diversity of voices represented. What used to be called a "programmer" no longer exists in the minds of the people who put together the job listings. Even the era of the hyper-specialist -- "user interface designer" or "back office engineer" -- has been superseded. We've gone beyond, into an area of super-beings: people who can specialize in many areas with several years of experience in the latest technologies. It all made me wonder: what effects do such expectations have upon the brain structures of people trying to conform?

My first thought is that increasingly complex, dynamic, shifting and expanding points of view would lead to a fractured consciousness. Disorders like bipolar, which may have a learned component, should increase dramatically in times of rapid cultural change. A quick Google search suggests that the rates are indeed increasing. An NIH study paints a pretty disturbing picture of a forty-fold increase in bipolar among youth.

While many may jump to the conclusion that the problem must be due to chemical factors, it is not only chemistry which affects brain function but also electromechanical factors. Neurons and their wires in the brain, dendrites and axons, grow and shrink in response to the challenges with which we are faced in life. That's why athletes don't train by baking cookies, tracking stocks, and learning a foreign language. Instead, they focus on a select set of activities centered around their sport.

Biblical admonitions such as being careful of what you allow yourself to watch and avoiding vain discussions may be thought of as mere moralizing, but from a physiological growth perspective they represent rational strategies for performance optimization and good mental health. So, my progressive and libertarian friends: when you hear conservatives express concern over societal issues, you would do well not to just dismiss the complaints as repressiveness.

Your politicians have been promoting Change for Change's sake alone: "Change is Good" is the motto. Your social networks promote excessively rapid interactions and the dissemination of half-baked and ill-informed ideas. You already had difficulty reading paragraphs longer than three sentences, but now your mobile communication methods discourage breadth and depth. People who use mobile devices excessively seem to be suffering from disuse atrophy: their ability to process and formulate long, logical thoughts suffers. Our kids are going crazy, literally. Maybe it's time for a little less "change" and a little more focus.

Friday, October 30, 2009

Why productivity is soaring in this downturn

Wonder why productivity seems to jump during a recession? It is not just because people are laid off. The real productivity impact of laying someone off cannot be measured immediately, because in the immediate aftermath of a layoff most companies have obligations to continue paying severance and into the unemployment pot. Also, let's face it: layoff decisions in big companies are much more about cronyism and favorites than about long-term values, successful missions, or measurable objectives. The Peter Principle ensures that those making such decisions are not quite capable of making them correctly. Failure is promoted, success is grounds for discrediting or dismissal; it is inevitably the PEOPLE WHO MAKE STUFF who get the shaft.

So there's part of the answer. To paraphrase Inigo Montoya: "You keep using that word 'productivity' -- I do not think it means what you think it means." Productivity is a euphemism for a completely misleading average: GDP divided by the number of labor hours. Even if the numbers used for GDP and labor hours were accurate, which they are not, the measure is too gross to mean anything about how a recession affects the structure of companies.

Another part of the answer is intuitive, and embodied in the saying "to cut the fat". Unfortunately, as we have seen in the American experience, the fatheads doing the cutting have conserved the fat, cut the muscle, and in some cases bled the company dry. The government has done its part by encouraging a service economy. To apply the same metaphor, this is in effect a body without muscle or bone.

Here's one thought I have heard: the recent GDP figure has been expanded wildly by raw government expenditure. The money Obama has poured onto non-production areas, like the funding of political cronies, is itself directly counted as an increase in the GDP. Too bad there's nothing underneath the fiat money to support its value. Worse yet, the money was not spent in a way that makes substantive improvements in energy or production. Instead, it has gone to political activities and feel-good projects, marketing and promotion. In effect, it is a government spending bubble. That's how we can have a grossly overstated rise in GDP in the middle of a long-term, government-initiated and perpetuated recession.

But here's one other thought I haven't heard: what if the "fat" were practices, not people? Over time, institutions get bigger and go bureaucratic. As I've been reminded over and over at the Web Design meetup, PEOPLE DON'T LIKE TO BE MADE TO THINK. So during good times, fat-headed processes like ISO9000, CMM, and all manner of Sarbanes-Oxley compliance programs get funded, staffed up, and pushed into the practices of everyday workers, making the meat well-marbled with fat. In a recession, the same processes stay on the books, but the meetings get tossed under the bus and the remaining workers find ways in their slow time to bypass the blockages these "quality" programs introduced. Similarly, marketing functions which poured gobs of money into Web development slow the outpouring for a time to a trickle; Web artifacts have the longevity of a gnat and yet take enormous amounts of cash to publish relative to their lifetime, so the slowdown pauses incredibly large and unproductive expenditures. But only for a time.

Saturday, October 24, 2009

The difference between Table-less and Semantic Markup

As organizer of a Web Design Meetup, I get to hear all manner of conversation about Table-less Web design from both expert and amateur designers. But their focus is inevitably on what someone else has told them about the evils of the <table> tag, and separation of something they call "style" from something they call "content". Before any of the current crop of born-again post-Flash designers came around, we structured-markup types focused upon separation of structure and semantics... distinguishing between the syntactic forms and the meanings or behaviors applied. The ill-defined terms "style" and "content", though present in the markup world prior to the Web, have come to be accepted HTMLisms with apparently much narrower meaning.

People think, for instance, that the following snippet (taken from a CSS Zen Garden theme) is wonderful, because it separates something (style) from something else (content):

<div id="container">
<div id="intro">
<div id="pageHeader">
<h1><span>css Zen Garden</span></h1>
<h2><span>The Beauty of <acronym title="Cascading Style Sheets">CSS</acronym> Design</span></h2>

<div id="quickSummary">
<p class="p1"><span>A demonstration of what can be accomplished visually through <acronym title="Cascading Style Sheets">CSS</acronym>-based design. Select any style sheet from the list to load it into this page.</span></p>
<p class="p2"><span>Download the sample <a href="/zengarden-sample.html" title="This page's source HTML code, not to be modified.">html file</a> and <a href="/zengarden-sample.css" title="This page's sample CSS, the file you may modify.">css file</a></span></p>

<div id="preamble">
<h3><span>The Road to Enlightenment</span></h3>
<p class="p1"><span>Littering a dark and dreary road lay the past relics of browser-specific tags, incompatible <acronym title="Document Object Model">DOM</acronym>s, and broken <acronym title="Cascading Style Sheets">CSS</acronym> support.</span></p>
<p class="p2"><span>Today, we must clear the mind of past practices. Web enlightenment has been achieved thanks to the tireless efforts of folk like the <acronym title="World Wide Web Consortium">W3C</acronym>, <acronym title="Web Standards Project">WaSP</acronym> and the major browser creators.</span></p>

<p class="p3"><span>The css Zen Garden invites you to relax and meditate on the important lessons of the masters. Begin to see with clarity. Learn to use the (yet to be) time-honored techniques in new and invigorating fashion. Become one with the web.</span></p>

And the code goes on. Now, I don't have a problem with it as far as it goes. They've done two things here:
1. Separated the CSS syntax from the HTML markup
2. Used most of the HTML element type names in the manner more-or-less specified in the W3C Recommendations

But look again at the markup. Oodles of nesting. DIV DIV DIV P and DIV DIV DIV H SPAN ... gosh that looks a helluva lot like tables, just with different element names. Oh, I know, I know, table semantics vary between implementations, and implementations are broken, and there are lots of good reasons for not using tables that don't directly affect other nestings of elements. And both DIV and SPAN are "semantic free" (sort of: DIV has the semantics of a generic block-level element, and SPAN has the semantic of a generic in-line level element). But you are still relying upon the side-effects of the style semantics implied by the nesting of the markup to achieve a style effect. Six of one, three of another, I agree the balance is in favor of DIV. But you've still got markup code that exists just for the sake of style. And that is not cool, not even Zen like.

When you're talking about separation of concerns, the style is all over the code like marinara sauce on spaghetti. The markup, by the way, says that the HTML page has seventeen major content divisions, all well and good because DIV just means a grouping of elements that is (usually) formatted as a block. But I object that "UL" is only barely related to the semantic of "menu", and other pages on the site use a sequence of "P" elements to represent the same menu. Please don't tell me this stuff separates the style semantics from the informational content -- clearly it really only separates CSS coding from HTML coding.

I understand it can be hard for Web designers, even the most experienced, to grok abstract concepts such as syntax, semantics, semiotics, and pragmatics. You've got pages to pump out, and such subjects may amount to mere philosophical diversions when your concern is primarily producing digital marketing collateral and not performing professional information management. But understand that if something is worth doing well, it is a valuable exercise to understand what happens when you go all the way, or at least push the boundaries.

On my soapbox, not only is the meme of "separate style and content" not fully realized in Table-less HTML coding, popular practice reaches precisely the opposite effect. The semantics of style are hard-coded into almost all of the HTML markup tags themselves. People just pick on TABLE because it is a scapegoat: the whole tag set is guilty. Never mind about Web 2.0: the Web has been stuck in beta mode for the past twenty years with a set of page style primitives that were accepted because they were free, not because they were good or even "good enough". HTML is wild and woolly, full of fun and frolic, but one thing it is not is a clean design. And that's my hot-button, because all this talk of separation of style and content in HTML is like dressing a pig in Prada.

Off my soapbox, I observe that the semantics of style include layout functions, what printers call imposition, and what in my experience involves staging functions for information content, not just colors, margins, backgrounds and borders. That's precisely where the Table-less discussion falls apart, because it doesn't go far enough. The bottom line is that "Table-less" is not enough. To separate style from content means so much more.

Friday, October 16, 2009

How collectivists handle blame

You know, I've been observing a pattern of behavior over the past few years, particularly in news reports but I've also seen it come out in the rationalization of some of my friends and relatives. Let me explain the meme using two figures often used to explain encryption algorithms, Bob and Alice.

Suppose Bob is a personality who is perceived as mainstream, affluent, majority, conservative, male, Republican, or having some other right-of-center attributions. It does not matter what the particular mix happens to be: Bob could be a blue-dog Democrat or a liberal-but-conscientious dissenter (like the senator from Connecticut, Joe Lieberman). Bob just needs to be seen as having some non-Liberal attribute upon which to focus. Bob is not a team-player.

Now suppose Alice is a personality who is perceived as marginalized, poor, minority, liberal or radical, female, Democrat, or having some other left-of-center attributions. Again, the particular mix does not matter: Alice could be a documented mass-murderer or an antisocial philanderer (like the senator from Massachusetts, Ted Kennedy). Alice just needs to be seen as possessing some attribute valued by a Liberal group. Alice is a team-player.

Now, when I speak to people about the Bobs and Alices in our political arena, here is the meme I see emerging from the conversations: Bob is personally vilified at the slightest allegation, but Alice could be caught red-handed and it will be claimed that both Bob and Alice are guilty of similar actions.

Alice could be caught with bodies at her feet and a smoking gun in her hand, yet in conversation Alice's personal responsibility will still be dismissed out of hand and to some extent transferred to "everyone" or "both political parties". The language brought up almost always assumes an orientation of hyper-empathy with Alice, at Bob's expense -- even when Bob is more than an arm's length away from having anything to do with the subject of discussion.

I think I started seeing this meme when Bill Clinton's penchant for raping women came out during his campaign. Does that language sound shocking? Re-read the definition of rape, and re-examine his record. The point is, it was his opponents that were eventually sullied with the stain of Clinton's guilt. But that was campaign rhetoric and political mud-slinging. Today, the meme has shifted into ordinary street language, and it is my thought that this shift was deliberate, fostered by news agencies following Democrat Party talking points and strategies for dealing with opposition.

Now, mind you, I'm not saying that if a Bob is guilty, he shouldn't be considered guilty. But if an Alice is accused, she should be investigated personally and we should not be so quick to allow her supporters to diminish her culpability or cast the aspersions of her guilt upon those who brought the charges.

I reject the meme.

Friday, October 9, 2009

Obama Wins Nobel Peace Prize (committee draft)

The Nobel Peace Prize for 2009

The Norwegian Nobel Committee has decided that the Nobel Peace Prize for 2009 is to be awarded to President Barack Obama for his extraordinary efforts [11 days after taking office] to strengthen international diplomacy and cooperation between peoples [with dictatorships]. The Committee [of left-leaning Norwegian politicians] has attached special importance to Obama's vision of [a socialist Amerika] and work for a world without nuclear weapons [in the hands of radical Islamic fundamentalists].

Obama has as President [for 11 days] created a new climate in international politics [among budding Fascists everywhere]. Multilateral diplomacy has regained a central position [displacing the United States' national interests with our own], with emphasis on the role that the [virulently anti-Semitic] United Nations and other [impotent] international institutions can play. Dialogue and negotiations are preferred as instruments for resolving even the most difficult international conflicts [even when U.S. citizens are being killed]. The vision [pretense] of a world free from nuclear arms has powerfully stimulated disarmament and arms control negotiations [failed and allowed unstable countries to join the nuclear arms race]. Thanks to Obama's initiative, the USA is now playing a more constructive role in meeting [being suckered into paying for] the great climatic challenges the world is confronting [other countries' energy expenses]. Democracy and human rights are to be strengthened [Goot! That sounds nice. Pass me another blunt, yah?].

Only very rarely has a person [an aspiring socialist] to the same extent as Obama captured the world's attention and given its people [tyrants] hope for a better future. His diplomacy is founded in the concept that those who are to lead the world must do so on the basis of values and attitudes that are shared by the majority of the world's population [in Communist China].

For 108 years, the Norwegian Nobel Committee [cabal of former politicians] has sought to stimulate precisely that [influence, distort, and buy off the] international policy and those attitudes for which Obama is now the world's leading spokesman. The Committee [of former Norwegian politicians] endorses Obama's appeal that "Now is the time for all of us [all of the U.S.] to take our [the rest of the world's] share of responsibility for a global [every knee jerk] response to global [our trumped up] challenges."

Oslo, October 9, 2009

Wednesday, October 7, 2009

Languages and projects

Perhaps it will be helpful for me to think about projects I have done, rather than recent disappointments. I'll consider languages, and mention projects summarily.

GnitPick, a Jython tool which integrated the NUX XML library with others within a story editor interface, complete with a Swing-based tabbed interface and cross-tab XML query capability. The tool was an attempt to evaluate Jython as a replacement for a 4GL interface builder, and I think it did quite well. Menus were built using Jython callbacks organized via XML definitions, and the models were maintained as XML trees.
I also started on a Google App Engine site called SplotchUp, using Python and the Web2Py framework. Knowing exactly how to get the job done tells me very little about what direction to go with it, so for now I'm stumped -- I need an objective problem to solve in order to keep going. No user stories means no tests, and no tests means no code.
I know I've done more Python before GnitPick, but it was mixed up with support related work. GnitPick itself was part of a larger project to bundle Java libraries together into an AJAX-like app framework, which was never published: at the time, the Java clients for HTML sucked. For that matter, they still suck.
Barcoding scripts using PDF::API2. Scripts to re-write the coding of PostScript files to work with proprietary RIPs on the Xeikon platform. Perl reporting code, literally: perl modules to create output reports for a System Verification Test test-case tracking tool. Various Perl CGI scripts to extract data using the Pear interfaces for business reports. Perl for support purposes for DDTS.
Mentioned below, I also maintained a Perl-based Wiki, Onaima, forked from Dolphin Wiki and hosted on SourceForge. Four groups within the organization used this Wiki, a test team, a development team, the quality/process group, and our engineering support team. For the quality group, I combined the Wiki with a corporate portal. The Wiki succeeded where a concurrent effort to roll out QuickPlace fell completely flat.
Block Print Paginator was my first real C project; BPP counted lines with a one page lookahead algorithm which determined how to best split the lines so as to utilize the page without leaving pages with orphaned lines or too small a margin.
The next was a Project Take-Off application, written for the event-driven GEM user interface library on the Atari ST platform. GEM turned out to be not unlike X/Motif. The application itself was a spreadsheet-like system which presented contractors' yellow-pad process for estimating by take-off.
I did lots of ETL programs for Nortel under contract. Puerto Rico 911's conversion was mine, start to finish. I asked for an ASCII version of the specification, cleaned it up with AWK, then used my scripting skills to incorporate the spec as an input to a Make-based build process, generating the validation and conversion regular expressions along with unit test stubs in an object-based framework (C-structs-with-pointers-to-methods) I devised myself. It was absolute perfection. When a question came up about stability, we could both look at the spec to answer the question (the spec was indeed at fault) and demonstrate functional performance by running the built-in tests, which had been coded first.
I also did ETL programs for Bell Atlantic and another telecom, working in a team of three -- though it would have been faster, more reliable, and better documented had I been allowed to take all the responsibility.
Another project was SMSR: Single Member Set Replacement, in another team of three. Actually, two of us did most of the work; the third was the lead and did virtually nothing except set us back, and eventually try to put me into a compromising position with my employer. I did a transactional update interface for SMSR using Lex/Yacc to define the grammar of the input files and C to define the actions.
A small tool I wrote at Kodak was an SCM user interface using the X/Motif library to interact with sccs. I called it XCCS.
After working on the IRTS project, I worked on a system I dubbed the All In One Search Engine. I began writing this in multi-threaded C++ using inverted indexes constructed from case-shifted words. After a substantial amount of work I determined that I could show results faster by redirecting to the C-based ht://Dig engine. I hacked the search algorithm and provided extended indexers. The method was to extract XML representations from various enterprise quality databases, indexing and saving links to these XML forms, while transforming into HTML representations. Searches were keyed on selected semantic XML structures, but results were presented using the HTML representations. Links to the XML and to the original databases (if still available) were also integrated into the HTML results. The system allowed enterprise users in Hong Kong to access quality records for products built and formerly supported in Raleigh, and transferred to China; it also allowed the US operations to discontinue the original quality databases, saving a 50k license immediately and all future support costs, since records were fully accessible in XML format. In other words, it wasn't just a search engine, it was a quality records clearinghouse. Observe that this engine was integrated in the 2000-2001 timeframe.

Look, HTML isn't a programming language. See my other comments on AJAX, XML, and JavaScript. I have done a number of Web sites now, some in Plain Old HTML, but many in Joomla.

While at Merlin, I was the principal author of the Deluxe Business Systems Imaging Composition Standard Specification. The standard was produced to define a markup language approach to interchanging order data for personalized print applications. The content included financial transaction information and document design specifications. The specification was meant to cover a variety of order capture situations, across all of the DBS businesses. This was all at a time when XML was not even a first draft, so we were well ahead of the curve. I produced DTDs and semantic descriptions, as well as a process specification to show how document instances would be used in a product production chain.
Microsoft had access to the SGML-for-EDI communications, I'm sure, because they were working with the DBS CTO at the time (for whom we produced the spec). Also, one of the partners later founded PODi, which in turn published the PPML standard, surfacing many of the print-on-demand markup application ideas we discussed in the form of PPML. So I'm confident I had a real, if not very visible or widely acknowledged, impact on the industry.
I joined Alcatel rather than going to Deluxe, because I wanted a little bit lower key position. My family, especially my wife, was pining for time with me. Being an "ordinary engineer" instead of an "executive consultant" gave them what they wanted.
At Alcatel, I eventually got back into a role as a management consultant and XML evangelist, having seen how markup would impact the fundamental coding practices of engineers. This was difficult, because many in management had lived insular lives. They were embedded in Telecom, and didn't understand the broader software market. Similarly, many engineers were unfamiliar with the technology and had difficulty understanding how it could impact them. Still, I gained clients. One network management group began basing their architecture on XML with XSLT; I provided XSLT and XML consulting to them. I also convinced the ADSL test group to adopt XML for test case automation; they said later in a memo -- after I had been laid off -- that I had "anticipated the organization's needs a year in advance". This was in response to a Systems group's recognition that their specification documents were a mess: the test group already had recaptured much of the necessary content and was using it in productive work.
An interesting project at Alcatel involved a configuration tool which emulated a provisioned rack and provided soundness checks as well as quick access to technical reference materials and quality databases. Two key aspects were that the "tool" was actually a set of XML documents created by an AJAX enabled administration interface, and converted to JSON notation by a publishing transformation. The documents were actually checked in to a document repository which provided consistency of identifiers and a means of subscribing to the transformations as a service. The schemas for the XML used DTD notation.
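The XML-to-JSON publishing step can be sketched with nothing but the JAXP parser from the Java standard library. To be clear, this is my own toy reconstruction, not the Alcatel tool: the element names and the flat, one-level conversion are assumptions made purely for illustration.

```java
// Toy sketch of an XML -> JSON publishing transformation (a reconstruction
// for illustration, not the original tool): parse a small XML document and
// emit a flat JSON object from its child elements.
import java.io.ByteArrayInputStream;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Node;
import org.w3c.dom.NodeList;

public class XmlToJson {
    public static String publish(String xml) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes("UTF-8")));
        NodeList kids = doc.getDocumentElement().getChildNodes();
        StringBuilder json = new StringBuilder("{");
        for (int i = 0; i < kids.getLength(); i++) {
            Node n = kids.item(i);
            if (n.getNodeType() != Node.ELEMENT_NODE) continue; // skip text nodes
            if (json.length() > 1) json.append(",");
            json.append("\"").append(n.getNodeName()).append("\":\"")
                .append(n.getTextContent()).append("\"");
        }
        return json.append("}").toString();
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical rack-configuration fragment, invented for the demo:
        String rack = "<slot><card>ADSL16</card><port>3</port></slot>";
        System.out.println(publish(rack)); // prints {"card":"ADSL16","port":"3"}
    }
}
```

A production transformation would of course handle nesting, attributes, and string escaping; the point is only that "publishing" here means a mechanical, repeatable conversion from the checked-in XML source to the JSON the interface consumes.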
There were others as well. Another "configurator" type schema I helped define was for a sprinkler manufacturer configuration system. Generally, my approach has been to create sample instances and use Trang to create drafts of the schemas in RelaxNG, iterating between the instance and Trang until the RelaxNG schema is structurally representative of what is desired. Then I refine the RelaxNG schema and begin forking off sample instances as test cases to show usage and conformance, dropping into WXS when I've got most of the semantics worked out and need to verify that I haven't introduced something WXS cannot represent.
I've done a number of stylesheet transformation projects, some big and some not so big. Among the first were cookbook stylesheets I wrote for training classes to introduce Alcatel engineers to XML and XSLT. They were way behind the curve on markup languages, and I collected a number of techniques from my previous experience with DSSSL and participation on the XSLT mailing list. I put it all together in a Wiki, another technology I introduced to engineering at a time before Wikipedia came onto the scene.
A typical project was for SPAWAR, to demonstrate a RoboHelp-like framed output, with a left pane holding context menus, a right pane holding content, a title pane, and a notification/branding/status pane at the bottom. Often CSS and/or JavaScript can be used to add effects. Note that JavaScript is not a hard requirement, since XSLT mappings provide fixed URLs without the need to perform AJAX-type interactions. One thing people ask for repeatedly is how to get XSLT to generate mappings between disparate files -- they often assume the XML "ID" attribute is meant for linking, and are somewhat disoriented when it does not suffice for cross-linking.
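That cross-linking question comes up often enough to deserve a sketch. In XSLT the usual answer is a key() lookup table; the same idea expressed in Python (element and attribute names here are invented for illustration) is to build an explicit index rather than lean on the "ID" attribute, which only guarantees uniqueness within one document:

```python
# Cross-linking two XML files by building an explicit index, instead of
# relying on the XML "ID" attribute. Element names are hypothetical.
import xml.etree.ElementTree as ET

topics = ET.fromstring("""
<topics>
  <topic code="T1" href="intro.html"/>
  <topic code="T2" href="setup.html"/>
</topics>""")

menu = ET.fromstring("""
<menu>
  <item ref="T2">Setup</item>
  <item ref="T1">Introduction</item>
</menu>""")

# Build the index once -- the moral equivalent of XSLT's key().
index = {t.get("code"): t.get("href") for t in topics.iter("topic")}

# Resolve each menu item's reference against the index.
links = [(item.text, index[item.get("ref")]) for item in menu.iter("item")]
```

The point is simply that the mapping lives in a structure you control, so it can span files, survive duplicate "IDs", and be inverted when needed.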
One interesting project was for Tekelec. BrightPath Solutions brought me in to provide XSLT, ANT, and DITA training. Following this up, I designed a DITA-OT transformation queuing system and provided an interface to integrate it with the Vasont client. The point was to delegate builds to high-performance remote servers, with orderly dispatching of requests. Apparently Sarbanes-Oxley was an issue as well, since controlled access to sources and deliverables was required.
My exposure to Java is still limited. I worked a bit with it in Eclipse when integrating the Java libraries for GnitPick. But Java can give nothing like the productivity of a properly defined 4GL environment. It retains many of the worst aspects of C++, in which Object Orientation, rather than making code more coherent and comprehensible, makes it instead more disjoint and idiomatic. I supported the XSLT transformation engines at Alcatel with some Java glue coding. I also played with a Swing based "ants" simulation, a kind of cellular automaton in which colored tiles are seeded and mutated according to a set of global rules, exhibiting surprisingly complex emergent behaviors. I used Swing buttons and a simple layout manager, rather than a canvas, for the ants simulation -- a poor choice for performance but it was easier to code.

One of the first samples I wrote for training used a Javascript-driven HTML shell to format and display the XSLT, XML, and XPath specifications in various ways, using an associative array to organize the possible transformations and Javascript to manage the user interface and delegate the calls.
The configuration tool mentioned under the XML topic was an early but very robust AJAX-style application. The system featured a completely Web-based administration interface which managed the loading of card-specific JSON profiles (created by transforming XML documents) through Javascript. Balloon-help popups were provided, as were dynamically composed editing forms with semi-automatic layout. Documents were retrieved as JSON, edited, submitted through a CGI interface to a document repository, and subsequently used to regenerate the JSON representation(s). The front-end consumed the same JSON files, loaded via a frame-based, Javascript-animation-enabled interface. The UI supported right-mouse context menus to fill in plug-in card slots, and red-yellow-green status displays on each card as the method of reporting compatibility issues. The rack itself was defined using HTML block-level elements and tables, with a carefully orchestrated sequence of event triggers to contextualize how objects in a class framework were populated as cards and software options were selected. We also provided an alternative pop-up dialog method of accessing the card information at a lower level, including access to quality databases and reference documentation. This tool succeeded at effectively managing an information set which five experienced managers had failed to organize using spreadsheets, and went well beyond that by providing a coherent delivery framework and two working, AJAX-based analytical tools, within three months of effort. Keep in mind, too, that the browsers at this time were not mature and did not provide the ease of use of the AJAX libraries available today.
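The publishing transformation behind those profiles was conceptually simple: authoritative XML in, browser-ready JSON out. A minimal sketch of that step in Python -- the card fields below are invented for illustration, not the actual Alcatel schema:

```python
# Sketch of the XML -> JSON publishing step: authoritative XML documents
# are transformed into JSON profiles a browser front-end can load.
# The card fields are hypothetical.
import json
import xml.etree.ElementTree as ET

card_xml = """
<card id="LIM-8" slots="2">
  <option>POTS</option>
  <option>ISDN</option>
</card>"""

root = ET.fromstring(card_xml)
profile = {
    "id": root.get("id"),
    "slots": int(root.get("slots")),
    "options": [o.text for o in root.iter("option")],
}
print(json.dumps(profile))
```

The asymmetry is the design point: the XML stays the single checked-in source of truth, and the JSON is a disposable, regenerable delivery format.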
For Deluxe, one of the development projects we performed was to demonstrate a feasible bridge from the Smalltalk business objects framework to a standard SGML parser library. We selected the YASP parser, and used IBM SOM to bridge the parser into a Smalltalk environment, building objects to represent the information content of our markup.
As mentioned, the search engine I began working on was also in C++.

Maple and C++
Another small project I did was for a college professor, implementing an algorithm for efficient search of dependent parameter subsets among datasets of oncogene features, using STL containers. Although I finished substantial parts of converting the algorithm from Maple to C++, the professor left for Europe around the time I graduated and I never heard back. It was apparently not all that important to her, but honestly, it was among the more challenging projects I've worked on.

Unix Shell
I've written little HTTP servers in Korn shell, just because I could, and a plethora of other scripts, many simple, some very, very involved.
To support a Web application, I wrote a document repository and publishing system as a suite of Korn shell scripts, written using an object-oriented approach: scripts represented classes, and arguments provided method signatures. The system could register new document types, as well as provide pre-check-in validation, post-check-in indexing, abstracting services, and transformations to create AJAX-style applications. The system was layered, providing a user-level interface to the repository and delegating functions to the document-type handlers. Sadly, management was clueless about the methodology and utterly ambivalent about the whole business division.
During some down time between writing ETL programs for Nortel, I did an educational project I called NopGen (nop == "no operation", a machine instruction which does nothing). NopGen was a code generator designed around the metaphor of plumbing pipes and couplings. The metaphor arose from a paper I wrote about the factoring of programming dependencies, and a paper on Bassett Frames in one of Tom DeMarco's books. I simply extended the UNIX pipes metaphor with a framework to organize the pipes and to parameterize blocks of code as templates. A Bourne shell prototype worked well but was slow and had no consistent way to organize the templates. Most of the functionality was implemented in a 1000-line AWK script, based on the idea of using the special character strings $- and -$ as chunk start and end markers, with generic labels and named parameters. (That is, it was a toy LISP-like language, with features of XML markup and XSLT.) A college professor said it belonged in an ACM paper, but I had no clue how to get that accomplished at the time.
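The delimiter idea is easy to illustrate. This toy Python analog is nothing like the original 1000-line AWK script -- the parameter names are invented -- but it shows the flavor: blocks of code held as templates, with $- and -$ marking the named parameters to be filled in.

```python
# Toy analog of the NopGen idea: code blocks parameterized as templates,
# with $- and -$ delimiting named parameters. (The real tool was a
# ~1000-line AWK script with labeled chunks; this is just the flavor.)
import re

PARAM = re.compile(r"\$-(\w+)-\$")

def expand(template, params):
    """Substitute each $-name-$ marker with its value from params."""
    return PARAM.sub(lambda m: params[m.group(1)], template)

template = "for $-var-$ in $-seq-$: print($-var-$)"
code = expand(template, {"var": "x", "seq": "range(3)"})
```

A full generator would add the "couplings" -- piping one expansion's output into another template -- which is where the plumbing metaphor earns its keep.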
I've also written a lot of Make build scripts, as well as several "install kits" based on UNIX shell with or without Make.

Pro*C/Oracle/PL/SQL or ESQL/C
State tax reports for Paychex. What can you say about that? It was pretty routine stuff.
Utilities to format XML output from database queries.
A lot of SMSR was actually written in Pro*C.
After creating the NopGen language, I took a database course at SUNY and produced a project I called GROG: Generalized Relational-Object Generator. GROG was a sort of CORBA-like IDL generator. Pro*C was used to access the Oracle system tables, which had been annotated through views to specify aggregated types. The tool's output was object-based C-with-structures-and-pointers, and instantiated specialized instances to establish a kind of meta-object protocol. I also added stubs for pre- and post-validation procedures, as well as CRUD-style accessors. Then I wrote a user guide and technical reference manual for it, as well as release notes and a shell/make-based installation kit. The instructor told me that in his mind it was graduate-level work, well beyond what was expected, and re-designated the course as "Advanced Database" to give me additional recognition.
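The core of a generator like GROG is mechanical: read relational metadata, map SQL types to C types, and emit structures. A minimal Python sketch of that core, with an invented type map and column list (the real tool read Oracle's system tables through Pro*C and did far more, including the meta-object instances and validation stubs):

```python
# Sketch of the relational -> C generation step. The type map and the
# EMP column list are invented for illustration.
ORACLE_TO_C = {"NUMBER": "long", "VARCHAR2": "char *", "DATE": "char *"}

def emit_struct(table, columns):
    """Emit a C struct declaration for one relation's column list."""
    fields = "\n".join(
        f"    {ORACLE_TO_C[ctype]} {name.lower()};" for name, ctype in columns
    )
    return f"struct {table.lower()} {{\n{fields}\n}};"

src = emit_struct("EMP", [("EMPNO", "NUMBER"), ("ENAME", "VARCHAR2")])
```

Everything else -- accessors, validation hooks, the meta-object layer -- hangs off this same metadata walk.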
Additionally, I did a lot of ESQL/C work on my Informix projects, particularly for the Kodak Gateway project: data archival programs, raw data loading programs, and scanner alarm monitors. (Although ESQL/C was an Informix product, it was not that different from Pro*C.)

One of my first jobs was with a small Unix reseller, where I was thrown to the wolves, as it were, on a ground-up accounting software project for a furrier's operation. The app and database were done with Informix 4GL. I also did a Sales and Support Lead Tracking system for a window-and-door manufacturer, as well as a Manufacturing Job Tracking system, also in Informix 4GL, built from the ground up to fit the client's operation. Later I worked on a POS system with another developer, using a more GUI-oriented version of the same 4GL, and on a Real Estate Multiple Listing Service database and financial reports system for another client. All required attention to detail on the database side, since most had been started by people who hadn't a clue about database analysis, normalization, or query optimization.
Later I worked at Kodak on the Gateway project, writing a report queue daemon in Informix 4GL and the embedded SQL/C product.
Alcatel's IRTS ISO 9001 incident/workflow tracking application was done with Oracle and the Ideo SmartGL language, along with some C, and non-Ideo scripting in Perl for the report and notification daemons. Their Change Control tool was similar. I also supported a System Verification Test tool which managed test cases with a SmartGL interface and hooks to LabVIEW, with Perl to do reporting. Again, I had to step in there to fix up the indexes, which had been constructed improperly, and the internal clients were very appreciative.
The SMSR project for Nortel used Oracle Forms as a user interface, which I had to work around aggressively to get it to behave: the designer had foolishly assumed it could handle hierarchical data sets without difficulty. Remember, this was Oracle 6.

I focus on components and templates for content management systems. My biggest project was for a Bosch Security Systems marketing program using Mambo -- an ordering site for branded distributor collateral. It required substantial modification: to com_content to integrate personalized previews, to com_users for personalization fields and logo uploads, shopping cart integration, a rewritten credit card gateway, and order "ticket" packaging for a pre-press production process. The templates also had to match the Bosch site styles precisely, which was not a trivial task. I continue to produce Joomla components. I wrote one component for bridging QuickBooks customers to Joomla users, modified contact components for various purposes, and adapted a number of other off-the-shelf components for specialized purposes. For CMS sites it is rarely economical to custom-program entire components from the ground up, since GPL components can usually be adapted with much less effort.

I have repaired and evaluated technical maintenance issues on ASP-hosted sites. The VBScript on them is not any more complicated than PHP, but the closed database environment of MS Access and the "Jet" database files can make these sites less productive to work with than LAMP environments. I spent a lot of time supporting the Print On Demand Initiative, an industry consortium. The chief need there was first to stabilize the site -- it had been left with database problems -- and then to work with the marketing team to put up landing pages for targeted email marketing campaigns. To let them track the campaigns, I included a URL-encoded parameter which then populated a hidden form field when the user clicked through. Eventually the client wanted to overhaul the entire site, but I was not interested in being a liaison to the foreign team they had in mind, so we parted ways amicably. Another site I supported was a wallpaper pattern ordering site which a client, Merlin, was considering purchasing. The code and database were a mess, however, and would not allow customers to complete orders. It was also not bringing in much business -- building that up was really an SEO/SEM problem -- so after learning about the business and digging into the code, I advised against the purchase.

I sometimes do MS Access work. The front-end is a VB GUI layout designer, the back-end is SQL-based, and the language is VBScript, so it is really just a 4GL environment, albeit a rather poor one. I say that because it is easy for naive programmers to get into a lot of trouble creating MS Access-based applications that look great but destroy the very data they were designed to capture and protect. One of the bigger efforts was a course evaluations reporting system for St. Augustine's College; a consultant had left them high and dry with an Access database from which all the report generation code had been stripped. Only the report formats existed. I wrote import procedures and queries to cross-tabulate and summarize the raw data, and to implement business rules as the client specified. I also wrote validation queries to check the integrity of the data beforehand, and of the report counts afterward. I still help them prepare the course evaluation reports every semester, refining the checks and adding a few features here and there to improve the stability of, and confidence in, the process. I also worked on an insurance rates calculation program while at STI, done in VBScript under MS Access, and a couple of other forgettable Access database projects.

I did a lot of serious reworking of the SBT Accounting Modules in my first job, for a client (Utica Steel). The boss sent me for SBT training. The coding was all in dBase with simple sequential record files. I was good at it, but the dBase/SBT environment was not conducive to maintaining the modifications: you were basically forking the codebase.

In view of this reflection, part of my difficulty is in connecting at a social level, and part in treating myself as being at a "tradesman" level. It inherently limits my options.

Feeling defeated

Every time I look at a jobs posting site, I get to feeling defeated and depressed. It isn't just that there aren't a lot of jobs that fit my interests -- there are few that fit my immediate background too. Going to school for a mathematics degree took me away from programming for two years. Even before that, I did part time projects doing XSLT work and XML schema design, but got no Java or .NET coding leads. So I'm S.O.L. for most of the programming postings, where people invariably expect the applicant to have worked on a team doing J2EE or .NET development in some complete project.

The real trouble started at the telecom company. Management there was completely ambivalent about the development work, and seemed to go out of their way to avoid putting together a coherent program. We inherited scads of crap work from engineering, involving Perl, Oracle, PL/SQL, Ideo SmartGL, HTML, shell scripts, and C code. What do you do when there's no recognition from management for anything innovative you are doing, despite lengthy explanations, obvious interest from the users you were hired to support, and concurrent validation from trends emerging in the tech literature? In retrospect the whole ship was sinking, and they were much less concerned with the purpose of the voyage than with saving their own skins. In that environment, I devoted a lot of time to doing the right thing for them -- working overtime on antiquated 4GL environments and proprietary tools -- but not necessarily what was good for me professionally.

I get to feeling like I'm some sort of invisible man. No matter how much time I invest in trying to stand out, no matter what new techie hype I learn about, it doesn't feel substantive, and I end up feeling ever more marginalized and forgotten.

I've volunteered. I've worked for nothing. I've presented. I've trained. I've organized meetings for user groups. I've networked and given leads to people. I've called, and found, inexplicably, that people don't return my calls. Goddamn it, my own RELATIVES won't even return my calls. Nothing new; it has been that way for years. I deal. But I have to ask myself after all these years, before I take another step: where's the payback? WHY should I care? WHAT is in it for me?

Yeah, and that attitude permeates my thought processes now when I hear about some "New and Improved" techie crap someone has going. Same old thing: get people suckered in to something which in the end is, on balance, bad for them. All these social networking sites? Look out. Yeah, I'm a member -- stupid. If you want to drink, you have to visit the watering hole, even if it means drinking where the alligators are swimming.

We are way too dependent upon technology, and it is distorting the way people think. A prospect has been dangling work in front of me for over a month. His messages, sent from a Blackberry, are always cryptic, incomplete, indecisive, procrastinating. In short, stream-of-consciousness. I point out that one time slot out of many will be busy and suggest others. He responds by not scheduling the meeting he specifically asked for. People have forgotten how to reason deeply.

Instead they react. They hear subtle cues and make fantastic leaps, instead of considering nuances. And the Web businesses that spring up to service their needs are there for convenience, to get them hooked, even if it means they lose control of their own business processes. All they seem to want to do is sell and pitch, advertise and market. Knowing means nothing. Producing means less. Excitement and action regardless of the consequences, means everything. I just don't relate to the group think.

I've also gotten inexplicably cynical about potential employers. I was happier when they deceived me about their relative value to society. Now I see a lot of companies and wonder why they exist. One guy I know sells character outlines. CHARACTER OUTLINES. As in bits of font faces. He gets other people's design work and converts the artwork to file formats using a program he bought for less than a hundred bucks. Then he sells the output; he even has a catalog and a blog. The market is made up of people using the bits for embroidery work. So his Web site is a commercial design resource. He is selling the output of a program other people could get for themselves, if they knew better. Hey, at least he is doing something with some meaning. When I think of potentially working at a bank, insurance company, telecom (done all of those), government, or any of a bunch of other types of companies, I get to feeling singularly demotivated. I feel like they really aren't worth it.

So as a friend of mine said when I asked him what the interesting problems were, "I haven't found a way to do things that are both interesting and which pay". It is incredibly depressing to contemplate that. So far it has held true. Makes me want to just toss this laptop, throw the company in, and just give up. What's the point, if I can't enjoy what I do?

Sunday, October 4, 2009

I don't deserve to be alone

I get it,
you're just coping with yourself.
Your protege has you all worked up,
is it denial or sheer cluelessness that you cannot see your own obstinacy through hers?
And why does empathy so often elude,
why is it so hard to see,
that she is you, and you are she?
I understand,
that you just want to be left alone,
to wallow and stew, isolate and vegetate,
to read and view by yourself in your disquiet.
It is not enough that I am kept to myself for the week-end,
the Untouchable, but
must I also be without the comfort of human discourse,
my name not mentioned,
the very thought of me also forgotten?
are you so oblivious,
that my soul dies within me,
starved for the simplicity
and warmth
of stimulating conversation?
That you cannot reply,
I do not fault you.
It is too much to ask, I know.
The void cannot reciprocate feelings,
but you pretended well, once upon a time,
had me convinced,
that we walked a path together,
that I was not alone.
An indiscreet message between sisters put lie to that tale,
the pretense found out,
wanting empathy yet finding none within yourself,
you showed me your best mimicry of comfort.
Now even that too is gone,
the facade too difficult to maintain,
crumbles with time.
I reached out this week-end, and found the void.
You were not there for me.
I do not deserve to be alone.

Thursday, September 24, 2009

Feeling dejected

Looking for work. Just feeling bummed. How to find a position that will pay enough and something with a smidgen of meaning? Never mind that -- how to make meaningful connections with people?

I truly wonder. Networking in the area of programming, too, seems to have been a complete dead end for me. Somehow, after all these years -- despite a bruising effort in school, working on projects part time, keeping up with the Web Design meetup -- I keep ending up in the margins, being forgotten, an outsider. It isn't surprising -- it has been pretty much a pattern in my life since I was a kid -- it just seems inevitable when it really shouldn't be.

A recruiter commented this morning that she was looking for someone who lived and breathed technology, who had a passion for the work. Mmm-hmmm. I've heard that expression before from peers who encouraged me to do open source projects. I get being passionate about what you do, really! And if you looked at my stacks of books and magazines, loads of notebooks with scribbled ideas and designs, and laptop folders filled with libraries and project work, you might get the idea that I have some enduring interest in the field. But I got the impression that was not precisely what the recruiter meant.

The recruiter seemed to be using a euphemism. That is, they wanted a person who would work on projects night and day, long hours and without adequate pay. Someone who is entrenched in one technology and completely two-dimensional in everything else.

Sigh. That isn't me. I tried to broaden my search recently, but most of the opportunities have been shot down for basically that reason. The more that happens, the more it makes me wonder if I should be dropping out of the field, and doing something completely different. That's why I feel a lack of passion. I need something objective to motivate me, something with meaning.

It is just getting a lot harder to envision what kind of opportunities I should go after.

Wednesday, September 23, 2009

Being a _____ Programmer

A recruiter called today, asking if I was a PHP programmer. Sure, let's go with that. Truth is, when people ask questions like that, they are asking if that's all I've done for the past several years. But for the past couple of years I've been studying mathematics. So the general answer to most of these questions is no, I'm not a _____ Programmer. I'm just a programmer who has done some work in _____.

And it hasn't been all continuous. I spend weeks here and there working in two or three programming languages at a time, on completely different types of projects. Then I move on. "Why don't you stay on one project longer?" people ask. Well, when the project is done, you move on. I don't believe in doing projects just for the sake of building IT cruft to fence the customer in, making them pay for junk they don't need. To my way of thinking, people could get away with about 20 percent of the technology they have, and would be better off without the other 80 percent. So I spend some of my time destroying technology. That's as much a description of my past work as anything else.

Take ETL, for instance. I've done jobs for Nortel converting telephone listings for exchange between their directory services products. These were one-off, batch-mode projects, with big buffers, logging, audit trails, and programming to maintain transactional integrity. Completely tested. Stuff you'd use on a 911 system, and we did. The whole point of the system was to be able to say with confidence that the old system could be turned down, shut off, and eventually discarded. That made the sale possible.

Or PHP. I hate it when people ask me, in not so many words, if I'm a web monkey. No, I have not spent years pretending to do serious object-oriented programming with a pseudo-object-based template language evolved from a hacked version of Perl. I have coded with PHP in MVC frameworks, though -- mostly code in Mambo/Joomla components, modules, and templates, but some other stuff as well.

Let's see... the first component I ever built was a personalized print-ordering preview component, using a Flash movie to show the look of a client's logo and address on a piece of printed material they could order. I also built a bridge component in PHP/MySQL to migrate customers from a Bosch Security corporate feed into that system's user table. That interface had to be transactional: we had to update and disable client records as well. So I wrote a differencing algorithm, polling a REST XML URL they provided to source the data twice a day, and applying the transactions in SQL. I really hacked up the Mambo core components for that project, so we could integrate an existing shopping cart's PHP code. It took a while, but I was able to work the Mambo template to conform very closely to the Bosch corporate Web site style guide. In the end the transitions back and forth were seamless (we used no disclaimer because the users were restricted dealerships signed in under a special program login).
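The differencing step itself is straightforward: compare the previous snapshot of the feed against the current one and derive add, update, and disable transactions. A hedged sketch of the idea in Python -- the record shapes and keys are invented, nothing here is Bosch-specific, and the original was PHP/SQL:

```python
# Sketch of a feed-differencing step: given the previous and current
# snapshots (keyed by account id), derive the transactions to apply.
# Record shapes are hypothetical.
def diff_users(previous, current):
    adds = {k: v for k, v in current.items() if k not in previous}
    updates = {k: v for k, v in current.items()
               if k in previous and previous[k] != v}
    disables = [k for k in previous if k not in current]
    return adds, updates, disables

prev = {"d100": {"name": "Acme"}, "d200": {"name": "Zenith"}}
curr = {"d100": {"name": "Acme Corp"}, "d300": {"name": "Nova"}}
adds, updates, disables = diff_users(prev, curr)
```

Each of the three result sets then maps onto its own SQL statement class (INSERT, UPDATE, and an UPDATE flipping an enabled flag), which is what makes the interface transactional rather than a blind reload.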

For the Bosch site we needed to tie specialized marketing print collateral articles back to back-end templates, front-end previews, and a pricing matrix implemented using shopping cart product catalog codes. This was done by introducing semantic tags inside the normal Mambo product articles. The semantic tags carried enough information to do the linking, and were stripped out before they got to the browser. That way we avoided having to fork the stock Mambo com_content table structure. We still needed specialized functions for the previews -- it would have been a lot easier with Joomla 1.5's template overrides -- but it worked pretty flexibly.

For production data feeds, instead of using REST I deployed a Git-like tool as a means of archiving and transferring orders from the front-end site to the production server. This was nice because we could recall the orders for a given time period and rebuild production queues on demand, using the SCM's commands, anywhere we wanted.

When I started consulting I contributed a Joomla site to a local Chamber, and built a PHP/XSLT/SQL component for that site. They needed a way to maintain their membership on their Web site. I put together a Joomla site with lots of components specially chosen to meet their needs: advertising, searching members' profiles, posting special feature articles, and so forth. It was in the bag, DONE. So, for icing on the cake, I wrote a bridge component to let them dump their QuickBooks customer database into the Joomla users database. Been there, done that -- did it differently this time. I took an XML file and wrote an admin interface to let the file be uploaded, transformed via XSLT into MySQL code, and applied directly to the database. Batch-mode stuff. It wasn't the prettiest, but it worked swell. An advantage here was that the Chamber could have maintained all their membership info within QuickBooks, including categories, and we could robustly populate the Web site with new features in a snap. They had all that; then, out of left field, they asked an ISV to bid on a new site. The ISV got six grand to build a new site, but provided no means of updating the most important advertising content. And the Chamber now has to update the membership list manually. Too bad for them. That's what I get for volunteering. (See my swimming-with-sharks post elsewhere on the blog.)
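That upload step amounts to walking the exported XML and emitting SQL. A rough Python equivalent of the transform -- the real work was an XSLT stylesheet, and the element names, table layout, and naive quoting here are simplified for illustration:

```python
# Sketch of the XML -> SQL batch step: walk an exported customer list
# and emit INSERT statements. Element names and the table layout are
# illustrative; real code would use parameterized queries.
import xml.etree.ElementTree as ET

doc = ET.fromstring("""
<customers>
  <customer><name>Smith Hardware</name><email>sh@example.com</email></customer>
</customers>""")

def to_sql(root):
    stmts = []
    for c in root.iter("customer"):
        name = c.findtext("name").replace("'", "''")   # naive SQL escaping
        email = c.findtext("email")
        stmts.append(
            f"INSERT INTO jos_users (name, email) VALUES ('{name}', '{email}');"
        )
    return stmts

stmts = to_sql(doc)
```

Batch generation like this keeps the authoritative data in QuickBooks and treats the site's user table as a derived artifact, which is exactly why the workflow survives re-runs.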

I've been somewhat conservative with the NC State Mindset Project site. The three most critical issues there were (1) running Joomla in a restricted hosting environment (which was incredibly difficult), (2) making sure the site passed accessibility checks, and (3) not introducing unmaintainable code into a research project with limited resources. I did copy and adapt a few Joomla components to meet the unique needs of the project -- a help request submission and tracking component, a personal journal entry component, a simple forums component, and a plug-in to go along with them. Those adaptations were straightforward and did not impact the core Joomla codebase. What was more of an issue was that certain PHP functions banned in the hosting environment were buried in core code. Some changes to a few components' MVC models and controllers were necessary, and to the Joomla core library, leading to a need to migrate patches forward with every security upgrade. Joomla 1.5's template overrides help here but do not eliminate the issue.

What else, what else...? I think that's it. I piddled with PHP sometimes at Alcatel, but it was small-time stuff; I had more serious Oracle database ETL tasks and 4GL application UI work to concern myself with there. In an enterprise situation, often the best strategy is to rely directly only upon the most stable, high-performance systems, and to isolate, insulate, and marginalize the rest through batch-mode or REST-style interfaces. Doing so is akin to normalization in a relational database: you ensure that the solution is structured with the least volatility.

Tired of Fracking around because of Internet Explorer

So, I was just looking at a form that worked well under Firefox. It looks like crap under IE 6/7/8 -- all the select boxes are mutilated and there's no accessible way to fix it.

Yeah, yeah, yeah, you tell me IE is used by everyone and his brother, so I've got to make my Web pages look good in IE. But I'm not going to. See my previous post. This qualifies as completely pointless effort. I'm not going to change my content just to accommodate the gratuitously introduced defects of an antiquated product.

If a Web page looks like crap because I didn't design it well, I'll fix it. Otherwise, if it looks like crap because the user insists on using the crappiest browser on the Web to view it, I don't have the time to waste. I'm not going to keep trying to teach this pig to sing.

I refuse.

Tired of burning away time

As a techie, I've learned and played with countless technologies in my career, and worked on a lot of projects -- some important but now forgotten, some completely pointless. All without substance: without form, and void.

Remember IRTS, SMSR, TAXPAY, ICSS, GATEWAY, PRTS? No? You wouldn't. They're gone, all faded away. To what body of knowledge have these corporate business projects contributed? None. What understanding of professional practice have these projects advanced? Well, some. Perhaps, like the beating of a butterfly's wings, a few like ICSS made some subtle impact, but the effects have long since been obscured and obliterated by non-competes and proprietary information agreements.

I don't think I want to sign another "proprietary information" agreement or "non-compete" again. You'd have to put me in the next pay grade for sure.

Then people suggest I work for free. Open Source is fine and dandy, but I cannot eat, pay my mortgage, cover myself with insurance, educate my child, or in general, live, for free. And I have considered open source. Yet I haven't found a project that is worth my time, that isn't almost completely pointless.

And why is it that, as a developer, I am the one who needs to get excited about others' projects? People who are altruistically committed to what they do frequently do not get compensated adequately. Companies like IBM and Red Hat benefit enormously from the donations of developers. That's OK, they can benefit, but many of the developers got nothing substantive in return.

If you've got the money, the need, and the interest, then it is up to YOU to excite ME. Convince ME that your minuscule, self-important activity is worth my time and attention. How does your organization benefit society, the body of knowledge, and my interests? Why should I care?

I don't want to spend another iota of my time or attention on pointless projects. It destroys my very soul.