Wednesday, May 31, 2006

A Vista on Desktop Search

It's the end of May 2006, and Microsoft have recently released the new Beta 2 test version of the Vista operating system (a.k.a. "Windows NT 6.0"). If you're short of reading matter, here's a link to the Windows Vista Beta 2 Product Guide, available either as a mere 60MB Word document or in Microsoft's new XPS format, viewable in Vista or via the free XPS reader (which I couldn't locate).

Good luck to Microsoft -- they'll need it, since it seems Vista still requires significant clean-up work to address issues like the ones mentioned at longhornblogs.com (notebook PC running hot, low battery life, driver problems, large disk space consumption, etc). They have four or five months to clean it all up before they reach the Release To Manufacturing stage late this year (for a product launch early in 2007). It looks like they'll need every day of beta testing to sort out the myriad driver problems, among others. I'm glad that it's not me!

The Windows NT 4.0 operating system suffered badly from failures (the infamous "Blue Screen Of Death", or BSOD) caused by poorly-written device drivers. With its successor, Windows 2000 ("Windows NT 5.0"), Microsoft introduced a device driver validation program, and that seemed to markedly improve reliability. My experience is that Windows XP ("Windows NT 5.1") improved on that again, to the point where a BSOD happens only very occasionally (a few times per year), so well done Microsoft!

At this late stage in the Vista beta cycle, it seems curious if not inexcusable that device manufacturers are not rushing to update their device drivers and ensure that they're all included in the betas. (See Vista Beta 2: The Return of Driver Hell and Driver Hell Avoided... For Now )

For me, with the sorts of things I mostly do, Windows XP works quite well and Office 2003 has far, far more functionality than I need. While I rushed to install Windows XP at the earliest opportunity, I'm not yet sure there's a convincing argument to do the same with Vista (likewise for Office 2007, even though it has some nice usability improvements). I expect that corporates will be even more reluctant. Interesting times ahead for Microsoft as it tries to convince the vast masses to upgrade. It will be a hard sell. I wish them well.

If Vista (and Office 2007) prove compelling enough to justify the upgrade costs (money, time, effort, learning curve, frustration), then we may all benefit. Time will tell. From what I've seen and read so far, I'm happy to stick with Windows XP for quite a while because it's very reliable and "good enough" for what I want to do.

Here's a review by the CRN Test Center Windows Vista Beta 2: An Improvement?

And yet another one, by Dr Dobb's Journal: Windows Vista Beta 2: Great Search, Improved Security, Hardware Snags This review has a glowing report on the new built-in search capabilities of Vista, and if it's as good as they say then this alone might convince me to do an early switch from XP to Vista. And here's the reason ...

MY QUEST FOR DESKTOP SEARCH:
Over the past year or so, I've been testing or researching (via demo or reading) a number of "desktop search" products for personal "power user" activities. These have included Copernic Desktop Search, Blinkx (no longer available for download, it seems), X1, Intellext Watson, dtSearch, ISYS and Verity Ultraseek. I wanted to discover the "best" one that:

  • Is free, or quite inexpensive.
  • Allows its index to be stored somewhere other than the C: drive (to keep the C: drive as small as possible, for backup/recovery purposes). Since I have well over 10GB of files to index (more than 5GB of IBM Redbooks PDF files alone), this means an index size of considerably more than 1GB (perhaps even in excess of 2GB), which is definitely something to keep well clear of the C: drive.
  • Does not crash when indexing files of all types (Copernic and dtSearch failed this test badly, and I was unhappy to waste time debugging their indexer engines)
  • Makes it easy to select the drives/folders/filetypes that are to be indexed (no tiny, inconvenient, fixed-size windows that have to be opened afresh for each choice).
  • Allows you to set indexing to run at low priority so as not to interfere with other desktop activities.
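The index-size arithmetic in the second point above can be sketched as a quick back-of-envelope calculation. The 10-20% overhead ratio is my own rough assumption about full-text index sizes, not a figure quoted by any of these vendors:

```python
# Back-of-envelope estimate of full-text index size. The 10-20%
# overhead ratio is an assumption for illustration, not a vendor figure.

def estimate_index_size_gb(corpus_gb, low_ratio=0.10, high_ratio=0.20):
    """Return a (low, high) estimate of the index size in GB."""
    return corpus_gb * low_ratio, corpus_gb * high_ratio

low, high = estimate_index_size_gb(10)
print(f"A 10 GB corpus => an index of roughly {low:.0f} to {high:.0f} GB")
```

Which is exactly why an index of that size is best kept off a deliberately small C: drive.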
I was a bit surprised myself, but for me the winner out of all these was Windows Desktop Search (MSN version is here, Enterprise version is here) in conjunction with a range of free IFilter add-ins from Citeknet to handle PDF files, ZIP files, etc (see also: More MSN Search Toolbar Search Add-ins). You could do far worse than this combination. Feel free to contact me if you want more details about my experiences with the various desktop search products that I tested.

If you're interested in desktop search, then take a look at this recent Dr. Dobb's Portal article: True Desktop Search - "Finding what you want when you want it is often easier said than done. Luckily the lines between the desktop and the Web are blurring—and the race is on for the best desktop search tool."

Friday, May 26, 2006

Leery about Theory?

A little theory goes a long way, at times. While I'm not by any means a full-time database practitioner, I've always had what some would call a dilettante's interest in DB theory and practice.

One of the most frequently visited Useful Links pages on my web site is the one that assembles a dense collection of database theory and practice topics:

In distinct contrast to me, a real enthusiast is Alf Pedersen, who has a web site, Sharing Knowledge of Highly Efficient Database Design, dedicated to database design. There are free and inexpensive database eBooks there. (I've never met Alf and I don't have any commercial relationship with him.)

And he's just added a new article that's really worth reading:

Database theory and practice: Advice and investigations

Tuesday, May 23, 2006

The Distributed Computing Fallacies

I've been involved with performance, networking and other similar issues since the 1970s, so rather willingly accept the truisms in The Eight Fallacies of Distributed Computing by Peter Deutsch. (The eighth one apparently was added by James Gosling. I notice that Wikipedia expresses an alternative view of the attribution of some of the others, but surely you don't really believe anything in a wiki, do you?)

Ingrid Van Den Hoogen has written (in January 2004) about them in Deutsch's Fallacies, 10 Years After and various others have commented about how universally they apply.

And now (May 2006) architect Arnon Rotem-Gal-Oz has expanded upon all of these fallacies in his weblog, and you can get it as a single PDF document -- but I found that the embedded URLs in this PDF are not live, so for your convenience here are links to his individual articles (which of course have live URLs). These are at DDJ, and free registration may be required:
Some good reading there, kept up to date with mentions of web services, etc. On a completely different issue, Arnon has also asked Should Architects Code? with follow-ups Should Architects Code: Round 2 and Should Architects Code: Round 3 -- plus The 7 Deadly Sins of Design

Finally, Mark Baker makes a few points in Web Services and the Eight Fallacies ... About "the nature of distributed computing on the Internet and how it is inherrently (sic) different than elsewhere" and that "HTTP defines the single most general coordination language ever developed". Read this article and see if you agree with his conclusions.

Monday, May 22, 2006

Migration between Microsoft Office and OpenOffice formats

Here's a useful and brief guide by Solveig Haugland about how to format your word processor documents for easier, more faithful interchange:

Smart formatting for better compatibility between OpenOffice.org and Microsoft Office

Thursday, May 18, 2006

Being objective about database replication

One of the key original architectural elements of Lotus Notes is its ability to replicate documents across different copies of the same database. [The copies must be explicitly set up as "replicas" of each other. If you're unfamiliar with this, see brief introductions at The History of Notes and Domino or NotesDesign and Notes Architecture, Internals and History or its mirrored page Notes Architecture, Internals and History for even more links.]

The Notes replication process is outstanding. It ensures that the different databases are kept in synchronization. It has stood the test of time since the late 1980s or early 1990s, and from Notes R4 onwards got even more granular with field-level replication (only individual fields that are newly created or updated or deleted need be replicated, not entire documents, a major performance booster). Notes databases have always been "rich-content document oriented" and -- unlike hierarchical and relational transactional databases -- by design no record locking occurs to prevent concurrent updates. Therefore when a given document is updated in multiple copies of the database, at the next replication cycle you can get document "replication conflicts" (multiple versions of the given document containing different fields and/or field contents). In essence, these conflicts must be manually resolved.
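The field-level replication and conflict scenario just described can be sketched in a few lines. This is emphatically not the actual Notes algorithm -- just a toy illustration, assuming (my invention for the sketch) that each field carries a simple revision counter:

```python
# Toy illustration of field-level replication with conflict detection.
# NOT the actual Lotus Notes algorithm -- a sketch assuming each field
# carries its own revision counter.

def replicate(doc_a, doc_b):
    """Merge two replicas of a document, field by field.

    Each replica maps field name -> (revision, value). Fields changed
    on only one side replicate cleanly; fields changed on both sides
    to different values become a conflict for manual resolution.
    """
    merged, conflicts = {}, {}
    for field in set(doc_a) | set(doc_b):
        a, b = doc_a.get(field), doc_b.get(field)
        if a is None or b is None:     # field exists on one side only
            merged[field] = a or b
        elif a == b:                   # identical: nothing to do
            merged[field] = a
        elif a[0] != b[0]:             # one side is strictly newer
            merged[field] = max(a, b)  # higher revision wins
        else:                          # same revision, different values
            conflicts[field] = (a, b)  # -> a "replication conflict"
    return merged, conflicts
```

Note that only the genuinely concurrent edits end up as conflicts; everything else replicates automatically, which is the performance point of working field by field rather than document by document.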

I've been deeply interested in database architectures ever since 1978, when I went to the IBM Rochester, Minnesota, development laboratory for pre-announcement training in the IBM System/38. This introduced a machine-based (microcode level) database which was essentially a relational database built in, below the operating system level. Don't get me started on the unique S/38 architecture, which later got enhanced and morphed into the IBM AS/400 and more recently the IBM iSeries Server -- now rebranded the IBM System i (what brand name next year?). And how I believe that Lotus Domino on this server is "the best of all possible worlds" ... Somewhere along the way, the unnamed built-in relational database picked up the DB2 moniker ... I could go on forever!

Although I don't specialize in databases, I've kept up my interest and try to keep abreast of major happenings. One of the more frequently visited useful links pages on my web site is the database section: see http://asiapac.com.au/Links/Database.htm or its mirror http://notestracker.com/Links/Database.htm

The above is all leading to the db4o open source object database for Java and .NET and specifically to point out a quite interesting article:
It says that "Besides being open source, db4o's features include an easy-to-grasp API, equally lucid persistence model, and the fact that it is a true object database (not an object/relational mapping database). Add to this its invisible transactioning and its adherence to ACID (atomicity, consistency, isolation, and durability) behavior, and you have an embeddable database that concentrates a lot of power in a small area. ... The entire db4o library is provided in a single .JAR file (or .DLL if you're using .NET), and you'll typically find yourself handling better than 90 percent of your database work with about 10 API calls." It sounds like great stuff.
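The "easy-to-grasp API" the article praises centres on db4o's query-by-example style: hand the engine a template object and get back every stored object whose populated fields match. db4o itself is a Java/.NET library, so the following is a purely illustrative Python re-creation of the pattern (the Pilot class echoes the kind of example used in db4o tutorials; it does not come from this article):

```python
# Toy re-creation of db4o-style "query by example" in Python.
# db4o itself is Java/.NET; this sketch only illustrates the pattern:
# match stored objects against a template whose unset (None) fields
# act as wildcards.

class Pilot:
    def __init__(self, name=None, points=None):
        self.name, self.points = name, points

class ObjectStore:
    def __init__(self):
        self._objects = []

    def store(self, obj):
        self._objects.append(obj)

    def query_by_example(self, template):
        def matches(obj):
            return all(value is None or getattr(obj, field) == value
                       for field, value in vars(template).items())
        return [o for o in self._objects if matches(o)]

db = ObjectStore()
db.store(Pilot("Michael Schumacher", 100))
db.store(Pilot("Rubens Barrichello", 99))

# Template with only 'points' set: the name field acts as a wildcard.
hits = db.query_by_example(Pilot(points=100))
print([p.name for p in hits])
```

The appeal is obvious: no mapping layer, no query language to learn -- you store and retrieve the same objects your program already uses.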

The article then goes on to explain the workings of the db4o "replication" capability, and even talks about the handling of replication conflicts -- shades of Lotus Notes replication! But, of course, this is in the realm of object databases, with Java and .NET programmability.

Well I'll be dammed! ... Or, Dammed Pedantry?

What is it that sets one's thoughts dashing off in unpredictable directions? A short while ago I came across an article somewhere on the Web which referred to "Liars, Damned Liars, and Statisticians". If you do a Google search for this term, you'll get quite a few hits.

What annoyed me about it was that the original wording was "Lies, Damned Lies, and Statistics" and it was made famous by Mark Twain, but not necessarily of his invention. It could have come from famous British PM Benjamin Disraeli (or elsewhere), as the following articles suggest: Lies, Damned Lies, and Something About Statistics and http://en.wikipedia.org/wiki/Lies,_Damn_Lies,_and_Statistics

Switching my brain into pedant mode:
The correct term is "damned lies" and not "damn lies" -- certainly not "dam lies" as some unfortunates write.

One of my earlier laptop computers had a very nice LCD screen but I was bitterly disappointed to find that one of the pixels was dead, so that a dark spot appeared whenever there was a light-coloured background. That particular supplier had a policy of accepting a couple of dead pixels as within tolerable manufacturing bounds, so they refused to replace the screen -- and naturally enough I made a mental note never to buy their brand again!

Switching myself into ultra-pedant mode now:
The same "faulty pixel" sort of thing happened to me regarding Gone With The Wind, a most memorable production in all respects -- except, that is, where Clark fluffed his famous line when Rhett tells Scarlett: "Frankly my dear, I don't GIVE a damn!"

Listen to this MP3 soundclip or this WAV soundclip to hear it for yourself. Clark really should have delivered the line as: "Frankly my dear, I don't give a DAMN." But with cannons booming, Atlanta burning, and all the other powerful storyline events, I suppose that most people wouldn't notice this wayward placement of word stress!

Maybe it's just the way that Americans like to tread upon the English language -- another example of "a common language dividing nations" perhaps? The Australian (more-or-less-British) pronunciation would have stressed "damn" and not "give". Maybe it was all Clark's fault, no one else to be held responsible!

I'm now switching off pedant mode ... In my research I luckily came across some interesting articles, including this page of Memorable Quotes from Gone With The Wind (such as "I believe in Rhett Butler, he's the only cause I know.")

Finally, going back to "lies and statistics" again, here's a real gem: The Worst Social Statistic Ever ... Can you believe it: "Every year since 1950, the number of American children gunned down has doubled."

Richard Schwarz points out one aspect of some poor journalistic efforts that manage to get published. In a similar vein, even an august corporation like IBM persisted, until about ten years ago, in describing data storage capacities (for disks, tapes, diskettes) with statements along the lines of "equivalent to a pile of double-spaced typed pages reaching half way from the Earth to the Moon". Why double-spaced? What font size and how many characters per line? What interline spacing and how many lines per page? From which part of the Earth to which point on the Moon, and at what stage in their relative orbits? What average thickness of the paper sheets? ... You know what I'm getting at, don't you!

Wednesday, May 17, 2006

Notes Presenter -- Let's eat our own dog food


Hear ye, hear ye, enthusiastic users of IBM Lotus Notes who prepare presentations and demonstrations or who package groups of files together with documentation for any purposes ... I'm throwing down the gauntlet, spicing the pie, putting the cat among the pigeons, and maybe even letting loose the bull in the china shop!

Here's my message to all of you Lotus Notes aficionados:


Eat your own dog food

Microsoft PowerPoint or Lotus Freelance (if anybody still uses it) or the newcomer OpenOffice Impress are all excellent in their own way as specialized presentation tools. But they have only the one basic function [to prepare slides] and they fall short whenever you need to assemble multiple resources for your presentations, demonstrations and software packages.

No excuses: now there's a complementary tool you should begin using, one that lets you prepare a different sort of slide presentation as well as providing a repository for all those other resources (files of all types) associated with your presentation or demonstration, operating in the way classic Lotus Notes does at its inimitable best.

Just released, as freeware, is the initial Beta Version 1.0 of "Presenter for IBM Lotus Notes" -- or "Notes Presenter" for short.

DO IT NOW! Please try it out and send me some feedback, bug reports, whatever.

Notes Presenter was designed particularly with those who prepare and deliver presentations and demonstrations in mind. You can find out about Presenter and get your free copy from either


I hereby set a friendly challenge to those in the Lotus Notes user community who prepare presentations and demonstrations ...

Whenever possible, use Notes Presenter to make Lotus Notes itself the presentation and distribution medium. The challenge is put out in particular to IBM employees (especially the Lotus Software group) plus IBM business partners and ISVs (Notes/Domino application developers) who champion Lotus Notes!



Magnetic Tape is alive and kicking

CNET News reports that IBM and Fuji Photo have come up with an improved magnetic tape technology mix, see Magnetic tape prototype makes data leap and the enlarged image at Tape's storage leap and here's an IBM news release about it.

Okay, so magnetic tape isn't dead -- but perhaps the floppy disk (a.k.a. diskette) has one foot or even both feet in the grave! For those of you who weren't in the industry in the 1970s, the first diskettes were 8 inches in diameter and they were really floppy indeed, unlike the later, smaller diskettes enclosed in rigid plastic shells. You'll find an image of an 8-inch floppy amongst those at Dinosaur Sightings 3 (and if you liked those images of times past, there's also Dinosaur Sightings 2 )

Saturday, May 13, 2006

(Additional software license may be required) !!!

Over at CNET there's a new article today about how best to supply inexpensive computing to the world's masses, The $100 box? It's already here

It discusses MIT's plans for a pedal-powered laptop, and Microsoft's reaction (Bill Gates has dismissed the idea of a shared computer). The article then points out that there's already a player in this field: "While Gates and Negroponte continue to debate the form and function of their respective plans for the $100 PC, a South Korean startup called nComputing has already beaten them to the punch."

Take a look at nComputing's description of the NStation L100 ... "You can share affordable computing environment with relatively cheaper cost of ownership. Just purchasing one host computer and multiple NStation L100 as needed up to ten, you can achieve computing environment of max. eleven users at the same." There are other models, the latest being the L200, for which there's a statement that you can support more than ten stations (up to 30) if the central PC is running Windows Server 2003.

While the base station itself may be inexpensive, the $100 price is only for the base station and the overall cost will be extended by the need to have (a) a host machine, with operating system, having sufficient processing and storage capacity, and (b) for each user, a keyboard and monitor and mouse, plus the power to run the monitor. Of course, there are plenty of spare used monitors and old keyboards around, so maybe these can be shipped across to help keep down the cost per user.
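The "extended cost" arithmetic above can be made concrete with a toy cost model. Every dollar figure below is a made-up placeholder to show the structure of the calculation, not a real price:

```python
# Toy per-seat cost model for a shared-host setup like the NStation.
# ALL dollar figures are hypothetical placeholders, not vendor quotes.

def cost_per_seat(seats, host_pc=500, os_licence=150,
                  station=100, peripherals=100, cal=0):
    """Total and per-seat cost when 'seats' users share one host PC.

    The first user works directly at the host; each extra user needs
    a station device, peripherals (monitor/keyboard/mouse) and,
    depending on the licensing rules, a CAL.
    """
    extra_users = seats - 1
    total = host_pc + os_licence + peripherals            # the host seat
    total += extra_users * (station + peripherals + cal)
    return total, total / seats

total, per_seat = cost_per_seat(11)             # no CALs (e.g. Linux host)
print(f"11 seats, no CALs: ${total} total, ${per_seat:.0f} per seat")

total_cal, per_cal = cost_per_seat(11, cal=30)  # hypothetical $30 CAL
print(f"11 seats with CALs: ${total_cal} total, ${per_cal:.0f} per seat")
```

Even with made-up numbers, the shape of the result is the point: the station is only one line item, and a per-user licence fee moves the per-seat total noticeably.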

The lack (or unreliability) of electrical power in some remote locations would in many cases be a showstopper, so the pedal-powered laptop certainly has its attractions in this regard.

There's another perspective on nComputing's product at Targa Hong Kong where they brand the device as the Targa OfficeStation NetStar. And there's also the Targa Net Star Network Monitor which looks like an ordinary 17-inch LCD monitor but has the OfficeStation built in.

But what about the central PC and its operating system -- what are the OS licensing implications? The nComputing page linked to above coyly (deliberately?) states, under the heading Simultaneous OS Operation:
(Additional software license may be required)
The parentheses are deliberately retained here, to indicate the coyness!

And there's the rub! I would imagine that if the central PC is running any flavor of Microsoft Windows, there will likely be a requirement for one CAL (Client Access License) per user, which would significantly blow out the overall costs. Comments on this would be appreciated. Official clarification by Microsoft would be best.

On the other hand, I see that Linspire ("developer of the world's easiest desktop Linux operating system") has been busily at work and has published this press release: NComputing and Linspire Partner with Yellow Penguin to Provide Multi-User Computing Solutions to South African Customers. Importantly, it says: "Linspire's Linux operating system runs perfectly on the host PC and once one genuine copy is installed, multiple users can share the operating system without paying additional access license fees." So good luck to them. I've purchased and installed the Linspire operating system, as a research project to compare a range of Linux distributions. I don't use Linspire for daily work -- I use and quite like Windows XP Professional -- and can say that Linspire certainly is packaged nicely, is easy to install and operate, and requires very little Linux knowledge at all (compared with some of the other distributions).

It's caveat emptor -- a case of "batteries not included" all over again. In short, it seems to me that it's all very well that the hardware costs can be kept quite low, but it could be that the (current) Windows licensing requirements will dictate that the central PC's operating system be Linux.

I've wandered outside my comfort zone here, any and all comments are most welcome!

- - - - - - - - - - - - - - - - - -
UPDATE: 20 May 2006
I just found Windows Server 2003 Pricing and Licensing: Frequently Asked Questions and it seems to me (personal view only, not at all a legal opinion) that Client Access Licenses (CALs) would be required for each and every device attached to the NStation that connects through to Windows. Perhaps the user is already covered for some number of CALs as part of the base Windows licensing, so would only have to pay for any connections above this number.

Friday, May 12, 2006

Back to the Future (with AJAX)

A couple of articles about AJAX that I've scanned recently seem to me to be less than 100 percent accurate: they use up valuable typespace debating whether the acronym for "Asynchronous JavaScript And XML" (often written "Asynchronous JavaScript + XML") should be Ajax or AJAX (hey, what's the accepted rule for forming acronyms?). At least one other article carelessly refers to AJAX as "Asynchronous Java And XML" (hey, I just might accept "Javascript" as a spelling, but leaving off the "script" part is going a bit too far).

Déjà vu -- I'm feeling blue! Unfortunately more and more self-styled pundits seem to be jumping on the AJAX bandwagon, so there's bound to be a swag of muddled statements coming our way. Pass me the aspirin and anti-nausea pills ...

Luckily there are some more perceptive commentators, and one of these would have to be Yoram Meriaz, who has put together what seems (to me) to be an accurate and succinct summary in Back to the Future with AJAX: The Pros and Cons of Replacing Desktop Applications with Web Applications. He looks at the pros and cons of the AJAX methodology and discusses whether or not AJAX is ready for the enterprise.

UPDATE [29 May 2006]:
Just out in AjaxWorld there's a relevant article by Alex Iskold ... The "Webification" of the Desktop: What Are the Implications for Web 2.0 and AJAX? which discusses that "There is no reason why our desktop applications cannot be web-aware."

Thursday, May 11, 2006

Comparing CRM software offerings

There are many Customer Relationship Management (CRM) software products out there in the great wide world. They run on different platforms, have vastly different capabilities, and they range in purchase cost from modest to extremely expensive. You can obtain them from retailers with little or no knowledge of how the package operates through to vendors that specialize in CRM and have vast experience in the field.

I'd class myself as a multi-specialist but CRM isn't one of my specialties. Nevertheless, I do offer a CRM product myself -- and it's freeware, no registration required, so you can't complain about the price! It's a Notes/Domino application that's designed to be simple to install, easy to operate, reasonably tailorable, and "good enough" for small to medium organizations of all types. Its name is CAPTURE, standing for "Customer And Project Tracking + Usage Reporting Extensions" -- where the "usage reporting" part indicates that it has NotesTracker built in, so that for auditing/compliance you can track all database actions (such as document reads, field updates, deletes). You can download CAPTURE from asiapac.com.au or notestracker.com and there's built-in documentation accessible via the "Help Using This Database" menu item. CAPTURE is downloaded fairly frequently, without any negative feedback, so it's worth a look!

The real point of this posting is to mention a free report that I've just examined called "Seven Questions Most CRM Vendors Are Afraid You’ll Ask: Success Secrets for Evaluating Customer Management Software." I was able to download the PDF file after registering with Onyx Software so I presume that you could get a copy the same way.

I found this 21-page report rather interesting, and would be inclined to give my free CAPTURE application a tick in quite a few of the categories -- but not all of them by any means, since CAPTURE was never meant to be all-encompassing. The fact that CAPTURE is Notes/Domino based is a big plus in its favor.

I think that the report will prove quite valuable to any of you trying to make a decision about this type of software. (Naturally enough, if you decide to take up this offer from Onyx Software it's only fair and reasonable for them to follow up with you, so don't get upset by this.)

A Plain Warning

David Sless of CRIA (the Communication Research Institute of Australia) has just published an interesting new blog article: WARNING! PLAIN LANGUAGE

He starts off:
If I thought it would do any good, I would advocate that every so-called ‘plain language’ document should have a warning.
WARNING! THIS DOCUMENT CAN MISLEAD
But warnings, like many aspects of communication—including ‘plain’ language—are not all they are cracked up to be (see our recent review of boxed warnings). The simple fact, with far from simple consequences, is that communication is messy and non-predictable (not just unpredictable). Any attempt to reduce communication to a few simple rules fails. Plain Language advocates offer us a few simple rules. They are good rules and in the main, well intentioned. But neither of these are a necessary or sufficient basis for good communication.

[Editor: here's the link to the article about Boxed Risk Warnings]

And, a bit later:
I was prompted to write this particular blog because Plain Language advocates in the USA are all of a dither with excitement; there is legislation before the USA Congress requiring all government bodies to use plain language, no doubt in the belief that this will lead in due course to plain language heaven.

Why is plain language heaven as unlikely in the USA as it has been in Australia? I could give a long and complex explanation, but the basics are simple enough. There is no secret to good communication.

You really should read the entire article. Its implications spread far and wide: whether you're developing software, preparing a product brochure, creating a PowerPoint presentation, writing a user guide of any sort, or in any other endeavour involving communications, there should be something in David's article to make you think more carefully about what you're doing.

For example, I put much thought and effort into the wording of pop-up text on my web site, into the wording of error messages, into the layout and contents of user guides. Importantly, I don't leave it at that but regularly review my works of art, trying to hone them to perfection -- whatever perfection might be. Could that pop-up error message be improved? Is that user guide missing any important information, and can it be arranged better? ...

We should all do this, but with insight from David Sless's article we can now also contemplate whether or not our creations suffer from the "plain language" syndrome!

Sunday, May 07, 2006

What is a "service" and what exactly is SOA?

In my unfortunately neverending quest for greater understanding of things, I've been spending a fair bit of time trying to come to grips with SOA (Service Oriented Architecture) as well as the mythical -- or is it fabulous -- beast that is called "Web 2.0" these days.

You'll find quite a few good references for these and similar matters that I've assembled at http://asiapac.com.au/Links/WebServices.htm or at its mirror/backup site http://notestracker.com/Links/WebServices.htm

You could do worse than start with a nice and succinct article from BEA, recently published (03 April 2006) ... SOA: Are We Reinventing the Wheel?, in which Nick Simha asks "What is a service?" and examines the hype around Service Oriented Architecture, looking at CORBA, DCOM, J2EE, Web services and the new kid on the block, service infrastructure.

And there's another good one also well worth a read: it's by Steve Bennett (also from BEA) ... Successfully Planning for SOA: Building Your SOA Roadmap which offers a concrete plan, along with tips and insights, to help you build an effective SOA roadmap and to help ensure the success of your SOA initiative.


UPDATE - 09 May 2006:
In his Edge Perspectives weblog, John Hagel has just written an interesting article: SOA Versus Web 2.0? in which he says:


... a cultural chasm separates these two technology communities, despite the fact that they both rely heavily on the same foundational standard - XML. The evangelists for SOA tend to dismiss Web 2.0 technologies as light-weight “toys” not suitable for the “real” work of enterprises. The champions of Web 2.0 technologies, on the other hand, make fun of the “bloated” standards and architectural drawings generated by enterprise architects, skeptically asking whether SOAs will ever do real work.
John had earlier asked if we are Ready for Web 3.0? -- which is getting ever closer to my concept of "Web Pi"


UPDATE - 20 May 2006:
Lawrence Wilkes at CBDI Forum has just penned an article about Top Ten Traits of the Successful SOA Organization


UPDATE - 06 June 2006:
Macehiter Ward-Dutton have just published some germane comments on all this stuff: SOA 2.0? Stop the madness (including some pointed comments about the IT industry analyst "profession").


UPDATE - 24 July 2006:
Steve Jones has an interesting blog about SOA and related topics. A couple of his postings are: Shooting the myths of SOA and How SOA helps you hit the agile sweet spot ... and (in case you really thought you knew what "reliable messaging" and all that "WS-*" alphabet soup is) this post: WS-RM, WS-RX, Reliable Messaging which is what?

Saturday, May 06, 2006

Our tangled web of expertise

This morning I was reading an online article about recent findings on the importance of the enzyme creatine as the agent responsible for Alzheimer's disease (the first time this enzyme had been found in situ in the brain, or any other tissue).

This led me to search around the Web for info about creatine, and I chanced upon the Journal of Biological Chemistry online edition. Having been a practising chemist (and high school chemistry teacher) for a while during the 1960s, before I diverted to the computer industry, I scanned a few of the abstracts just out of curiosity. For example, there's this one:

Papers In Press, published online ahead of print April 11, 2006
J. Biol. Chem, 10.1074/jbc.M601555200
Submitted on February 17, 2006
Revised on March 30, 2006
Accepted on April 11, 2006

Crystal structure of mammalian cysteine dioxygenase: A novel mononuclear iron center for cysteine thiol oxidation
Chad R. Simmons, Qun Liu, Qingqiu Huang, Quan Hao, Tadhg P. Begley, P. Andrew Karplus, and Martha H. Stipanuk -- Division of Nutritional Sciences, Cornell University, Ithaca, NY 14853

Cysteine dioxygenase is a mononuclear iron-dependent enzyme responsible for the oxidation of cysteine with molecular oxygen to form cysteinesulfinate. This reaction commits cysteine to either catabolism to sulfate and pyruvate or to the taurine biosynthetic pathway. Cysteine dioxygenase is a member of the cupin superfamily of proteins. The crystal structure of recombinant rat cysteine dioxygenase has been determined to 1.5 Å resolution, and these results confirm the canonical cupin β-sandwich fold and the rare cysteinyl-tyrosine intramolecular crosslink (between Cys93 and Tyr157) seen in the recently reported murine cysteine dioxygenase structure. In contrast to the catalytically inactive mononuclear Ni(II) metallocenter present in the murine structure, crystallization of a catalytically competent preparation of rat cysteine dioxygenase revealed a novel tetrahedrally coordinated mononuclear iron center involving three histidines (His86, His88, and His140) and a water molecule. Attempts to acquire a structure with bound ligand using either co-crystallization or soaks with cysteine revealed the formation of a mixed disulfide involving Cys164 near the active site, which may explain previously observed substrate inhibition. This work provides a framework for understanding the molecular mechanisms involved in thiol dioxygenation and sets the stage for exploring the chemistry of both the novel mononuclear iron center and the catalytic role of the cysteinyl-tyrosine linkage.

Wow! And I've been complaining about all of the technical terms and acronyms used in the IT industry (such as the rash of them that have arisen for Web Services and Service Oriented Architecture).

It made me ponder what would happen to us after a cataclysmic global event, such as a huge asteroid colliding with the Earth, or an all-out nuclear war: you know, the aftermath that is a favorite theme of "disaster movies" when civilization fails. It's such a "tangled web of expertise" that we have woven -- and the sum total of our knowledge expands inexorably and exponentially.

What would happen to our hard-won knowledge and skills (such as those in biological chemistry evidenced in the above abstract) after the cataclysm? Oh, it's all too disturbing to contemplate, so I'm going to give up on this train of thought and go do something nice and mind-numbingly easy like watching the football. Do you blame me?

Friday, May 05, 2006

Business model of Web 2.0 startups

A self-explanatory web site: The Web 2.0 Business Model -- obviously very carefully researched, would you agree!

Also see my earlier post concerning Web 2.0

UPDATE - 10 June 2006:
Just about everywhere that you turn these days, you bump into some sort of article or weblog post about Web 2.0. It seems to have generated an unstoppable momentum, a sort of Web 2.0 runaway train.

Most of the commentaries are written from the technical and architectural perspectives, but here's one of the relatively few about the venture capital side of things: A VC's view of Web 2.0

But one article that I'm really in tune with is Alex Payne's The True Folly of Web 2.0, and here are a few quotes from it:

"Web 2.0 can’t see beyond itself. ... We’re failing our potential. ... We should be moving forward. We should be innovating. It’s no accident that people view the Web 2.0 sphere with the same economic, technological, and moral trepidation that they did the Dot Com era."

Thursday, May 04, 2006

How to improve a "creepy mess"

There's an interesting article over at Dr. Dobb's Portal ...

Read all about the consequences of a MESS + CREEP design in Database Design: How Table Normalization Can Improve Performance by Steven F. Lott.

It's all about database normalization and defragmentation, in case you were wondering!
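Normalization is easier to appreciate with a concrete case. Here's a minimal sketch (the table and column names are my own invention, not Lott's) showing why storing a fact once, rather than repeating it on every row, eliminates the classic "update anomaly":

```python
# Hypothetical sketch: splitting a denormalized orders table so that
# customer details are stored once, not repeated on every order row.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Denormalized: customer name/city repeated on every order.
cur.execute("""CREATE TABLE orders_flat (
    order_id INTEGER PRIMARY KEY,
    customer_name TEXT, customer_city TEXT, amount REAL)""")
cur.executemany("INSERT INTO orders_flat VALUES (?, ?, ?, ?)", [
    (1, "Acme", "Sydney", 100.0),
    (2, "Acme", "Sydney", 250.0),
    (3, "Bolt", "Perth", 75.0),
])

# Normalized: one row per customer; orders reference it by key.
cur.execute("""CREATE TABLE customers (
    customer_id INTEGER PRIMARY KEY, name TEXT, city TEXT)""")
cur.execute("""CREATE TABLE orders (
    order_id INTEGER PRIMARY KEY,
    customer_id INTEGER REFERENCES customers(customer_id),
    amount REAL)""")
cur.executemany("INSERT INTO customers VALUES (?, ?, ?)",
                [(1, "Acme", "Sydney"), (2, "Bolt", "Perth")])
cur.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(1, 1, 100.0), (2, 1, 250.0), (3, 2, 75.0)])

# A customer move is now a single-row update, instead of touching
# every historical order -- the "update anomaly" normalization removes.
cur.execute("UPDATE customers SET city = 'Melbourne' WHERE name = 'Acme'")

rows = cur.execute("""SELECT o.order_id, c.city
                      FROM orders o
                      JOIN customers c USING (customer_id)
                      WHERE c.name = 'Acme'""").fetchall()
print(rows)  # both Acme orders now report the new city
```

The narrower normalized rows are also where the performance angle of Lott's article comes in: less repeated data per row means more rows per disk page, and hence fewer reads.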

Wednesday, May 03, 2006

Russ Olsen's Five Truths About Code Optimization

I've written a little bit about debugging in earlier posts, and just came across a closely-related subject ... Five Truths About Code Optimization in Russ Olsen's weblog. It's quite a nice summary, and therefore most worthy of a reference here. Whether it's Java or any other development language, these five principles apply equally well.
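I'll leave the five truths themselves for Olsen's post, but the bedrock principle underneath any such list is "measure, don't guess". A tiny sketch of what that looks like in practice (the example and function names are mine, not Olsen's):

```python
# Measure two string-building approaches instead of assuming
# which one is faster -- optimize only after the numbers are in.
import timeit

def concat_loop(n):
    s = ""
    for i in range(n):
        s += str(i)          # repeated concatenation
    return s

def concat_join(n):
    return "".join(str(i) for i in range(n))  # single join

# Both produce identical results ...
assert concat_loop(500) == concat_join(500)

# ... but only a measurement tells you which wins, and by how much.
t_loop = timeit.timeit(lambda: concat_loop(500), number=200)
t_join = timeit.timeit(lambda: concat_join(500), number=200)
print(f"loop: {t_loop:.4f}s  join: {t_join:.4f}s")
```

The same discipline applies in Java (or anything else): profile first, then optimize only the hot spots the profiler actually found.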

But you'd better not visit his weblog, else you'll be enticed into spending time reading some of his other valuable posts, such as Three Languages For Java Programmers and The Same Mistakes, Over And Over not to forget The Secrets of Software Project Estimation -- so, be warned!