Thursday, April 29, 2010

Thursday at Future Web / WWW 2010

Another exciting day at the conference. danah boyd gave a great presentation on privacy and the social-science implications of big-data social networks.

Now I'm sitting in on the social networking panel. Zeynep Tufekci says, and I'm paraphrasing, "The most severe punishment short of killing someone is socially isolating them"; she also suggests that cognitive ability may have developed as a result of the need to maintain a mental list of friendships and alliances. It makes me wonder: is this why so many smart engineers are so inept socially?

The panel points out that before social networking, most speech was ephemeral, dispersing and disappearing over time. In contrast, the stuff we blog about tends to hang around, and in some cases builds increasing weight over time.

Tufekci makes another interesting point: Americans are increasingly without "close" friends, and even familial relationships aren't of the quality they once were. How we gain close friends is an important question, and theories abound as to why we might not. We are spread too thin. There are the "non-people" who aren't good at using Web social networking, or who refuse to engage in discussions using the popular tools; if you don't participate on the internet, or are uncomfortable with that mode, then you are excluded and marginalized from the important social interactions. Tufekci claims that inheritance-based relationships are giving way to affinity-based connections.

Wayne Sutton points out that the effect may also be amplified by the existence of privileged social networks. Two people may have almost the same experience in a physical setting, yet one obtains perks that the other does not, owing to participation in a social networking system.

The ongoing discussion prompts stray thoughts: social networking facilities will become the place for registering and maintaining identity certificates. They will be the interface through which our extended lives are run.

That reminds me: during this panel, and in other talks, references to pop culture media abound. The Matrix is popular, but there is also Soylent Green. I'm surprised that the Borg episodes of Star Trek are not mentioned, but I suppose they are too dated. A comparison that is being missed is to those Trek episodes in which some character's higher brain functions were transferred into, and eventually subsumed by, an expansive computer core. Unlike those characters, no advanced alien culture will be stepping in to decouple our kids from the mutagenic network in which they have grown entangled.

I try to ask a question, but the time for the panel has expired, so I ask face to face instead: when will social networking systems expose an explicit and controllable policy for data retention? After all, forgetting is an important function for maintaining mental health, yet none of the systems make such accommodations.
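To make the question concrete, here is a minimal sketch of the kind of accommodation I have in mind, with hypothetical names throughout; no current platform exposes anything like it:

```python
from datetime import datetime, timedelta

# Hypothetical per-item retention policy: every post carries an
# explicit, user-controllable expiry, and the system "forgets"
# anything past its horizon instead of retaining it forever.

class Post:
    def __init__(self, text, retain_for_days):
        self.text = text
        self.created = datetime.now()
        self.expires = self.created + timedelta(days=retain_for_days)

class Timeline:
    def __init__(self):
        self.posts = []

    def publish(self, text, retain_for_days=30):
        self.posts.append(Post(text, retain_for_days))

    def forget(self, now=None):
        # Programmed forgetting: purge posts whose retention window
        # has closed, rather than letting them accumulate weight.
        now = now or datetime.now()
        self.posts = [p for p in self.posts if p.expires > now]
```

The point isn't the mechanism, which is trivial, but that the expiry is part of the published contract with the user.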

In the afternoon, I sat in on the Core Values of the Future of the Web panel. Berners-Lee couldn't make it. There were a few interesting bits, but my attention began to wander as Danny Weitzner ran on in his opening. I hung tight. The panel danced around its subjects so vaguely that the discussion might as well have been made of aerogel, also known as solid smoke: you know there is something there, but it is very hard to pin down just what it is.

Wednesday, April 28, 2010

Sitting in on the interview w/ Berners-Lee and Weitzner

This is sort of a "fireside chat", with very political overtones to the discussion. Not sure what to make of it. The process has always been politicized, but as the Web becomes increasingly important to public and private life, I wonder whether control over the direction of government will be further wrested from individuals and our local civic institutions.

But my battery is running down, not having been charged long enough. I'll have to blog retrospectively.

OK, the break is almost over, the next interview, with Vint Cerf, is almost ready to start, and I have a plug now.

A side note about the food. After the awful lunch I'm still not really hungry, but it is a little off-putting to see others wandering around with food. It is like a tale of two Webs, one a Web of "haves" and the other of "have-nots". I know we got a bargain with the pricing of FutureWeb, but the venue could have provided better catering. Most of the meals and coffee breaks during the workshop were great, but there were problems. I'm not sure the convention center management did the right thing by offering stale rolls for lunch. And considering the hit-or-miss state of eating downtown, the FutureWeb attendees could have been offered something more than a note that "you're on your own" for lunch. Constructive recommendations, for instance, or an optional lunch ticket for purchase.

Raleigh's Downtown Sucks for food

So, lunch at WWW 2010 was absolutely horrible. First, because FutureWeb didn't include it. Second, because you couldn't add it on as an option. Third, because Raleigh's downtown has few good restaurants, and no good pizzerias, unless you consider hole-in-the-wall bars good food joints. I stopped at one little place, named "Niro's". The sign out front said "pay first", and I made the mistake of doing just that. After two bites I had to spit it out. I threw the rest down in disgust, but the damage was done: I lost my appetite. Word has it the meal served to the regular attendees wasn't much better.

Raleigh, is this really the way to impress the visitors?

Other notes from the plenary

One more note about the "bit rot" observation by Vint Cerf. Living systems, at least higher-order organisms, uniformly exhibit a kind of design pattern for forgetfulness and remodeling of structural components, in particular the decomposition of parts that are no longer operating satisfactorily. We forget things. Our bones break down and reform. Our hair falls out. That is part of the correctly working design.

What most information system planning lacks, and what is most often missing from information systems themselves, is an apoptosis function. We need a "programmed cell death" aspect interleaved within architectures in order to avoid an otherwise inevitable overgrowth and the need for undesirable necrosis.

Mind you, I don't mean garbage collection; we already have systems that do garbage collection. We also have systems that serve an immune function, though we don't handle that very seamlessly. What I'm really talking about are architectural styles that lend themselves to ongoing modification while neither introducing new fissures nor forcing parts to hang around forever.
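A minimal sketch of the distinction, with all names hypothetical: the point is that retirement criteria are declared in the architecture itself, up front, rather than bolted on as cleanup.

```python
import time

# Hypothetical component registry in which every part declares its
# own "apoptosis" criteria when it is created. Retirement is part of
# the architecture, not an afterthought like garbage collection.

class Component:
    def __init__(self, name, max_age_s, health_check):
        self.name = name
        self.born = time.time()
        self.max_age_s = max_age_s      # scheduled death, like a cell's
        self.health_check = health_check

    def should_die(self):
        too_old = time.time() - self.born > self.max_age_s
        unhealthy = not self.health_check()
        return too_old or unhealthy

class Architecture:
    def __init__(self):
        self.parts = []

    def grow(self, part):
        self.parts.append(part)

    def remodel(self):
        # Programmed decomposition: cull parts that have signaled
        # their own end, before necrosis (a crash) does it for us.
        self.parts = [p for p in self.parts if not p.should_die()]
```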

Somethin Somethin dub dub dub 2010

As I write, Vint Cerf is mumbling at the podium. Oops, there he goes, now he's unstuck. Cerf is speaking to the underlying connectivity of the network infrastructure.

Mobile has 4.2 billion users, versus 1.3 billion PCs. Follow that traffic trend... Asian countries have over 750 million users with only 20% or so penetration... that's the growth opportunity. The shift to mobile returns us to a time when the network was not constantly connected. This is happening just as expectations for the interfaces are increasing -- photo, audio, geo-location -- and smart application behaviors will require the software infrastructure to perform much more coordination and negotiation. REST appears to be a much smarter integration methodology than a sloppy stateful architecture.
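That aside deserves a concrete illustration. A minimal sketch, with every name invented for the example: in the REST style each request carries all the context needed to service it, so an intermittently connected client doesn't depend on the server remembering a session.

```python
# Toy contrast between a stateful, session-bound handler and a
# stateless, REST-style one. All names here are illustrative only.

sessions = {}  # server-side memory the stateful style depends on

def fetch_page(user, page):
    # Stand-in for real data access.
    return f"items for {user}, page {page}"

def stateful_next_page(session_id):
    # Breaks if the server forgets the session, e.g. after an
    # intermittently connected mobile client drops off for a while.
    state = sessions[session_id]
    state["page"] += 1
    return fetch_page(state["user"], state["page"])

def stateless_next_page(user, page):
    # All context travels with the request, as in a URL like
    # GET /users/{user}/items?page={page} -- nothing to forget.
    return fetch_page(user, page + 1)

print(stateless_next_page("alice", 1))  # items for alice, page 2
```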

Near-term changes he suggests: IPv6, internationalized domain names, security for the DNS system, and digital signing of address registrations; the openness and laxness of router security has become a major area of risk. Also coming: the addition of consumer appliances as internet-connected devices, and the inclusion of local-area sensor nets.

On to the deeply linked, Semantic Web: to a markup geek, it is nothing new at all. Web publications. Persistent public identifiers. Perhaps it is telling that a past heretofore shunned by the Googles of the world is now considered to be its future?

Cerf went on to speak about security, but much of what he covered really dealt with the promiscuity of sloppy practices on the net. Lax behaviors in social networking, unsafe configurations, lack of diligence by governments and businesses, and invasive devices all contribute. "If anyone is looking for a good thesis, the problem of determining if a configuration is bad is a hard problem." The site StopBadWare.org was set up with the intent of ridding the internet of botnets.

New technologies he covered include flow routers, Map/Reduce (functional programming over cloud computing resources), and cloud collaboration; the vision he presented of an invasive introduction of sensory devices was as much a replay of the Star Trek: The Next Generation "Borg" episodes as I've ever heard.
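For readers who haven't met the term, here's the canonical toy example of the Map/Reduce style, a word count sketched locally in plain Python; in a real deployment each phase is distributed across a cluster of machines.

```python
from collections import defaultdict

# Canonical Map/Reduce toy: word counting. The map phase emits
# (key, value) pairs; the shuffle groups them by key; the reduce
# phase folds each group into a result. In a real system each
# phase runs in parallel across many machines.

def map_phase(document):
    return [(word, 1) for word in document.split()]

def shuffle(pairs):
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    return {word: sum(counts) for word, counts in groups.items()}

docs = ["the web of data", "the web of documents"]
pairs = [pair for doc in docs for pair in map_phase(doc)]
print(reduce_phase(shuffle(pairs)))  # {'the': 2, 'web': 2, ...}
```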

He notes that Moore's Law is no longer in effect, due to the effects of heat. Our bloated software can no longer be made faster by faster hardware. (But don't forget that some of the lowest levels of software are written to make up for mistakes in the hardware.)

Cerf closed with several important, but again historic, problems. The IP problem -- the Web being a system enacted as a massive number of copies of data. The possibility that our data is encumbered by the computing systems for which it was originally created, and the concern for long-term availability of the data and the software. Similarly, the creation of artifacts to which no reasonable access can be had without the massive resources of the cloud itself. How do we preserve and replicate these environments as they evolve?

At the end of the talk, the last question, asked by a woman, was about Cerf's twice referring to "bad configurations". His answer struck a raw nerve deep within my soul: we need a system -- an algebra or calculus -- for detecting bad configurations. I wrote just that sort of application several years ago. This might be a big opportunity for me.
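In the spirit of that answer, here's a minimal sketch of what the bottom rung of such a calculus might look like: configurations as plain dictionaries, and "badness" as composable predicates over them. Every rule and field name here is a hypothetical illustration, not Cerf's system or mine.

```python
# Hypothetical configuration checker: rules are composable predicates
# over a configuration, and a config is "bad" if any rule fires.

def rule(description, predicate):
    return (description, predicate)

RULES = [
    rule("telnet should be disabled",
         lambda cfg: cfg.get("telnet_enabled", False)),
    rule("default admin password unchanged",
         lambda cfg: cfg.get("admin_password") == "admin"),
    rule("remote management open to the world",
         lambda cfg: cfg.get("mgmt_interface") == "0.0.0.0"),
]

def bad_configuration(cfg, rules=RULES):
    return [desc for desc, pred in rules if pred(cfg)]

router = {"telnet_enabled": True, "admin_password": "admin"}
print(bad_configuration(router))
# ['telnet should be disabled', 'default admin password unchanged']
```

The hard problem Cerf alludes to is that real badness is rarely a property of single fields; it lives in the interactions between settings, which is why a true algebra over configurations would be thesis-worthy.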

I almost didn't register for the conference. My cynicism thus blunted, I look forward to the rest of this event with more optimism.  

Attending FutureWeb 2010

Against my better judgment, I decided to attend FutureWeb at the WWW2010 conference. Mainly because it was cheap, it was local, and they were offering free coffee at the breaks. Seriously, it seemed like a good deal to be able to rub shoulders with others in the deeper Web domain.

Right now it is the opening plenary. They've been speaking for 10 minutes already when a young lady with a backpack sneaks in and sits down beside me. A few moments later, having gotten the import of the speaker, she does physically what I've already done emotionally by opening this blog page: she walks out. They're talking about the decisions they had to make in organizing the conference and selecting papers. Not exactly designed to capture the imagination. Then again, FutureWeb is really just an excuse to attend the conference without attending any of the arcane technical presentations.

As I finish this, a third person sits down beside me. Fortunately, Vint Cerf is about to speak. Let's see what happens.

Tuesday, April 27, 2010

WS-REST 2010

A facetiously named workshop on REST was held at WWW 2010 yesterday. After a day of interesting papers on the distinctions between REST and non-REST architectural styles for Web Application Programming Interfaces, two things struck me as most important: (a) although it may affect them deeply, users don't care about the finer points of the architecture so long as they get an API that works, and (b) there is something wrong in what the community understands about the theory and application of links within the payloads of Web APIs.
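Point (b) is easier to see with a concrete payload. Here's a sketch of the hypermedia style at issue, with every URL and relation name invented for the example: the response carries typed links that tell the client what it can do next, instead of the client hard-coding URL patterns.

```python
# Illustrative hypermedia-style payload: links embedded in the
# representation drive the client's next actions. All URLs and
# relation names here are made up for the example.

order = {
    "id": 42,
    "status": "unpaid",
    "links": [
        {"rel": "self",    "href": "/orders/42"},
        {"rel": "payment", "href": "/orders/42/payment"},
        {"rel": "cancel",  "href": "/orders/42/cancel"},
    ],
}

def follow(representation, rel):
    # The client navigates by relation type, not by constructing URLs.
    for link in representation["links"]:
        if link["rel"] == rel:
            return link["href"]
    raise KeyError(f"no link with rel={rel!r}")

print(follow(order, "payment"))  # /orders/42/payment
```

It is exactly the semantics of those "rel" values, and their stability over time and location, where the community's understanding seems shakiest.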

I did not mean to be impertinent, but I had to remind the WS-REST panel at the end of the workshop that much work had been done in the SGML world on the subject of addressing and linking, in particular HyTime. The panel members were quick to acknowledge the prior work. One subtle rejoinder was that HyTime was developed for interactive applications, not for programming interfaces.

I let that comment slide when I should not have. While ISO 10744:1992 runs to hundreds of pages and is not necessarily the standard to solve their immediate problems, HyTime was developed to provide guidance on markup language architectures, not any specific application. A substantial fraction of those pages are relevant to the linking and addressing problems, however, and much research effort went into thinking about the stability-over-time-and-location problem faced by REST adherents today. Standards that go around, come around.

Wednesday, April 21, 2010

Electric Car Fantasies and Other Government-Run Debacles

I attended a pleasant and slightly inspiring Science Café talk last night at the Irregardless Cafe. Rogelio Sullivan, Associate Director of the Advanced Transportation Energy Center, gave a short overview of the conditions and challenges surrounding the development of electric cars, and then fielded questions from the crowd.


My own interest stems from a general curiosity about the subject dating from my childhood years. Battery-operated cars, hovercraft, monorails... this was the stuff of Popular Mechanics/Science/Electronics. Well, it was popular. It has just never proven realistic in an engineering sense.


Quite the contrary has in fact happened. One thing that stood out above all else, at least to me, was how often diesel power came up. Sullivan mentioned it first, alluding to concerns over diesel emissions and saying that the US market never embraced diesels. I have heard those templates repeatedly -- that we don't want diesels, that they pollute -- so often that I believe they are clichés to those who repeat them.

During the meeting, several more people asked pointed questions about the safety of electric cars, as well as about the economics and environmental impacts. Diesels came up again and again. On any one of these issues a pure electric car would be marginal, but most electrics fail miserably when you consider more than a few of the issues together. For instance, Sullivan pointed out that it could take decades to migrate the electric grid to renewable and/or nuclear sources, and in the meantime electric cars would be generating pollution through the coal- and gas-fired generators currently supplying the grid. In contrast, renewable diesel is here now, and requires no disruptive infrastructure changes.

After the meeting I remarked to Mr. Sullivan that something seems to be missing in the explanation of why diesels are not here. I have yet to see a credible analysis of the economic trade-offs between small diesel engines and electric drivetrains that shows the electric vehicle (EV) emerging as the better-fit solution. Pollution-management nightmares created by huge numbers of spent battery packs containing heavy metals, the explosive risks of novel high-energy-density battery technologies, poor end-to-end return on investment, chicken-and-egg problems in bootstrapping supply stations, high immediate manufacturing costs, ultra-short travel ranges... the vision of a pure EV is a REALLY LOUSY PROPOSITION.

The old saying is "that dog won't hunt", but in this case it has been clear for years that this particular dog has no legs to stand on. One wonders why academics keep shoving the poor thing out into the field. No amount of government subsidy game-playing is going to put legs on that poor crippled hound.


That, by the way, is not to suggest that Mr. Sullivan's discussion was not balanced overall. As he observed, we will end up with a mix of technologies on the road, including hybrids and diesels. It is just my prediction that on balance, large scale use of purely electric vehicles will prove to be a dangerous and uneconomical boondoggle.  Renewable biodiesel would be a far wiser path to follow.


In all, it was a satisfying night. I was hoping for something deeper, more material, but that was an unreasonable expectation to place on an open invitation event.


By the way, if you've been to Irregardless before, you know the food is high quality. It isn't a value-priced menu but it is quite reasonable for a nice night out. You should check out the remodeled interior. It is a wonderful venue for this kind of talk. The event was sponsored by the NC Museum of Natural Sciences and the Sigma Xi society.

Thursday, April 15, 2010

On Semantic Meaning, and Other Muddled Concepts

So, Smashing Magazine has a very articulate writeup on how developers can paint themselves into a corner with DIV-based designs, nearly as thoroughly as when they were abusing TABLE tags to do layout.

The anti-pattern of using semantic markup for layout keeps reappearing due to the lack of an analytical mindset among designers, whose main strengths lie in visual synthesis and communication.

Not to nitpick, but in describing the situation, the Smashing author says:

"When developers do not understand the semantic meaning of other block-level elements, they often add more div tags than are needed."

While I agree with the sentiment, such a quote does make one wonder whether the author understands the meaning of "semantic". It reads as a bit redundant.

I've made the same mistake myself. It isn't the "semantic meaning" of block-level elements we need to worry about; it is the semantics -- the meaning -- that we infer by means of the markup that should concern us.

In other words, we worry too much about precision in tag names, and not enough about whether the constraints we've put on the vocabulary will allow us to express the distinctions we need to make with necessary and sufficient accuracy. (OK, that is probably a less clear explanation than the previous sentence, but I'm leaving it. The reader will have to work for it here.)

HTML makes too few tags serve too many functions. The layout features in particular should be marginalized into the realm of backwards compatibility. Distinctions like "block-level" and "inline" are the semantics of a formatting process, not of content.