9.30.2006

Berners-Lee, Gödel and Turing

What do Tim Berners-Lee, Kurt Gödel and Alan Turing have in common? I'm not entirely sure. But that's the title of a newly arrived book, "Thinking on the Web: Berners-Lee, Gödel and Turing," by Peter Alesso and Craig Smith. I'm intrigued.

The back cover says:

Tim Berners-Lee, Kurt Gödel, and Alan Turing are the pivotal pioneers who opened the door to the Information Revolution, beginning with the introduction of the computer in the 1950s and continuing today with the World Wide Web evolving into a resource with intelligent features and capabilities. Taking the main questions posed by these thinkers—"What is decidable?" by Gödel, "What is machine intelligence?" by Turing, and "What is solvable on the Web?" by Berners-Lee—as jumping-off points, Thinking on the Web offers an incisive guide to just how much "intelligence" can be projected onto the Web.

One of the benefits of being a technology journalist/analyst is that books like this show up, unannounced, courtesy of publishing companies (in this case, Wiley), who hope that I'll review it. Dozens of titles show up on my doorstep each month; a few get kept, but most are donated to a local junior college library. This one looks interesting; I'll read it on my next plane trip, and let you know what I think. If you've already read it, feel free to beat me to it and post your own comments.

More on Borland

My colleague Larry O'Brien has weighed in regarding Borland's moves to rename/reposition/rejigger its Core SDP products into a new set of application life cycle suites. Once upon a time, Larry was one of the biggest and most loyal Borland supporters imaginable, but his faith has waned, and waned and waned, and now it has waned some more.

His current blog posting is "Borland Gives Up On Core SDP: I Wonder How Much That Cost 'Em?", and it references one of his older SD Times columns from 2004, "Only Nixon Would Go To China." Both are worth reading.

9.29.2006

Borland's Core Dump

Borland has a new application life cycle management strategy. The company, which has been undergoing a radical shift since the departure of CEO Dale Fuller last November, is moving away from its role-based Core SDP ALM solution. Instead, the company is releasing a new line of tools which are more function-based, called LQM.

This strategy makes sense. Core SDP, which the company had flogged continuously since March 2005, divided software developers into four different roles: analysts, architects, developers and testers. Different ALM tools within Borland's product line were assembled into four suites -- called Core::Analyst, Core::Architect, and so on. Companies would then license the appropriate suites for their developers, everything would interoperate, and software would be written.

Borland's role-based approach is far from unique. The two big bananas of the software tools market, IBM and Microsoft, have similar role-based focus within their IBM Rational and Visual Studio Team System solutions.

The problem is that few companies divide out their world that way. Different people play different roles at different times. Not every company defines the roles the same way, or uses the same terminology, or even wants the same subset of tools for developers within those roles. In short, it was a good idea, but not good enough.

Thus Borland's new strategy, which still sorts the company's tools into four piles -- but by function, not by developer role. So, there will be a suite for quality management, one for IT management and governance, one for requirements definition and management, and the fourth for change management. Developers would select the building blocks that they need. Or that's the plan.

Note that Borland is still selling the same individual tools, like the SilkCentral test management software, or CaliberRM requirements manager, or the newly acquired Gauntlet test-automation software. But they're being assembled in a more rational way.

Alex Handy got the scoop on all this in this week's News on Thursday newsletter, and we'll have a fuller report on it in the Oct. 15th issue of SD Times.

Hewlett and Packard

The ongoing drama at Hewlett-Packard has me rapt. Beyond its involvement with Mercury (which HP is in the process of buying), the corporate-spying scandal doesn't have much immediate relevance to my own world of software development. However, it is a fascinating tale, and it's interesting to watch it unfold.

Certainly friends who work at HP are equally focused. Morale at the company is as bad as when Carly Fiorina was laying off people left and right in the wake of the Compaq acquisition. Not good, not good at all.

Within this broad story, there's room to enjoy what (to me) is a perennial issue: What do you call the company?

Legally, the firm keeps changing its name. It used to be the Hewlett-Packard Co. But nowadays, it's the Hewlett-Packard Development Company, L.P.

What about shortening the name? There we have lots of options. I use HP, as you can see. That's also the company's preferred usage. However, others also use H-P, H.P. or my personal favorite, simply Hewlett.

Nobody, but nobody, in the tech industry calls Hewlett-Packard "Hewlett." That's reserved for out-of-touch mainstream journalism, such as the New York Times headline for today's story about the hunt for the leaker. The Times also uses the abbreviation H.P.

The Wall Street Journal likes the abbreviation H-P, even in headlines, but calls the company Hewlett-Packard Co. on first reference within the story.

Our style at SD Times and other BZ Media publications is to follow a company's own preferred usage, whenever possible; we only make occasional exceptions, such as when companies have gratuitous punctuation as part of their name. We drop the exclamation point (which in journalism is called a "bang") from names like Yahoo!, for example, because it's disruptive when you're trying to read.

But I still get a kick out of seeing the first ref to HP written as simply "Hewlett." It always takes a moment to figure out who the NY Times is talking about. C'mon, guys. You can do better than that.

9.28.2006

Feed me!

A little knowledge is a dangerous thing, and I'm at that stage of my nascent blogging career. Two friends, upon hearing about my blog, suggested that I add a link to its XML feed to the blog page.

Sounds easy, I thought. The software supports syndication feeds, and there's a convenient CSS stylesheet. How hard can that be?

It only took a few minutes to find the right spot in the stylesheet and insert the link and descriptive text. Which I spelled wrong. (Did any of you see the site during the half-hour with the glaring error?) Then I found a nice graphic that says RSS and inserted it instead of the descriptive text. Nerdvana.

Only to be told by one of my friends, Larry O'Brien, "Yours is not an RSS feed but the competitive Atom format." That's embarrassing. But now we have a working syndication feed and the right graphic. I hope.
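For the curious, the RSS/Atom confusion is easy to resolve programmatically: the two formats use different XML root elements. Here's a minimal sketch that tells them apart (the sample feed documents are hypothetical, made up for illustration):

```python
# Distinguish an RSS feed from an Atom feed by inspecting the XML root element.
import xml.etree.ElementTree as ET

ATOM_NS = "http://www.w3.org/2005/Atom"

def feed_format(xml_text):
    """Return 'rss', 'atom', or 'unknown' for a syndication document."""
    root = ET.fromstring(xml_text)
    if root.tag == "rss":                 # RSS 2.0 uses a bare <rss> root
        return "rss"
    if root.tag == "{%s}feed" % ATOM_NS:  # Atom uses <feed> in its namespace
        return "atom"
    return "unknown"

# Hypothetical sample documents:
atom_doc = '<feed xmlns="http://www.w3.org/2005/Atom"><title>My Blog</title></feed>'
rss_doc = '<rss version="2.0"><channel><title>My Blog</title></channel></rss>'

print(feed_format(atom_doc))  # atom
print(feed_format(rss_doc))   # rss
```

Had I run something like this against my own feed URL, the Atom surprise wouldn't have been a surprise.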

Zilog enters the 16-bit market. Again.

My first exposure to microprocessors came through the use of the Zilog Z80 chip. It was hard to do any work with small computers in the late 1970s and NOT use the eight-bit Z80; they were relatively cheap, easy to build circuits with, and simple to program. Many hardware and software engineers, including myself, cut our teeth on Z80 assembly. Early microcomputers, like the Radio Shack TRS-80 and numerous CP/M boxes, used the Z80 before the IBM PC came out and redefined the landscape around Intel's x86 family.

But Zilog, what have you done for us lately? Not much, given the company's recent financial woes -- millions of dollars of losses every quarter. While the company still sells variations on its newer eight-bit Z8 microprocessor, it also offers other stuff like infrared controllers. Even so, the breadwinner is the Z8, which includes onboard flash memory, perfect for embedded microcontroller applications.

Zilog has lived in an eight-bit world for 30 years. Yes, the company has attempted to break out of the eight-bit box before, such as with the short-lived 16-bit Z280 and 32-bit Z380 processors from the early 90s. But they just didn't go anywhere.

Flash forward (so to speak) to 2006. One month ago, the company dumped its chairman/CEO, Jim Thorburn, who had been in place for five years, appointing an interim CEO while beginning a search for a permanent replacement. And now it has released a new 16-bit platform, called ZNEO.

ZNEO looks like an interesting chip, and a possible upward migration path from the Z8 microcontroller. It has fast zero-wait-state internal flash memory, in a variety of sizes ranging from 32KB up to 128KB: that's a lot of space in the embedded market. Plus, it has a math engine that can do 8-, 16-, and 32-bit operations.

Clearly, the ZNEO project is going to be critical for Zilog, as the firm struggles to survive. It's a chicken-and-egg situation: Zilog needs new customers and design wins in order to get its finances in order. But will embedded developers, who perhaps rely upon the company as a provider of tried-and-tested commodity chips, want to base their future products on an unproven platform from a troubled supplier? Time alone will tell, but I have my doubts.

9.26.2006

The Itanium Solutions Alliance...

Tomorrow I'm heading up to San Francisco for the second day of the Intel Developer Forum. I've received many meeting invitations for IDF, but have been struck by the paucity of news or announcements that would apply to software developers. The bulk of the third-party announcements have focused on storage and wireless networking. The farther Intel gets from its roots in CPUs and developer tools, the less relevant much of its ecosystem becomes, at least for me.

One of today's announcements, one of the few that I found interesting, involved the use of dual-core processors in embedded applications. Dual-core processing will have a significant impact on the embedded/device development market, which has traditionally deployed single processors with single cores. In a hard real-time environment, using a high-end RTOS from companies like Wind River or Green Hills, or one of the hardened versions of Linux, applications must be both tight and deterministic. How well will that play in a dual-core environment, where you have multiple hardware threads that won't be synchronized? It should be less of a problem than with dual discrete CPUs -- but it's going to be an issue nonetheless. I look forward to learning more.

Something that I don't particularly want to learn more about is what's happening with the Itanium processor, which could be fairly characterized as an increasingly niche product. Sure, the vertical scalability of a high-end Itanium 2 processor can be impressive, but the world belongs to the 32-bit and 64-bit x86 processors from Intel and AMD. While RISC will still play a role, particularly with Sun's SPARC processors, Itanium is destined to remain the poor stepchild, relegated to specific applications, like big honkin' databases. And there's nothing that the Itanium Solutions Alliance -- a vendor consortium set up by Intel to promote the processor -- has done to change my mind about that.

9.25.2006

Cyberfortress vs. low-hanging fruit

My 9/21/06 "Zeichick's Take" about automotive security brought several letters-to-the-editor, one of which made an excellent point that applies well in the physical security world, but which in my opinion falls down in cybersecurity.

Steve Brewin wrote,

"Apparently the vast majority of crime is committed by amateurs chancing on an easy opportunity. The simple lock removes the easy opportunity, amateurs will look elsewhere. Professionals play for much higher stakes and while they can easily bypass such simple security mechanisms, the probability of an attack from them is massively less. Most targets are not worth their time. For most, the cost of installing systems capable of thwarting their attacks is disproportionate to the risk.

The insurance assessor explained that while most viewed their offer as a marketing exercise, their statistics told them that the discount they were offering was small compared to what they expected to save in the cost of claims alone.
"

That's very true. Professionals can unlock cars, remove The Club, jimmy house doors open, even break a laptop security cable with a small bolt cutter. So can a determined amateur, who can pick up the right tools, or practice simple techniques. But what about the casual amateur? The kid walking through the parking lot who sees a brand-new iPod sitting in a car? For that kid, a locked door may be sufficient to make him move on.

In other words, if someone is bound and determined to steal YOUR things, locks probably won't help. But if they're just looking to steal SOMETHING, they'll pick the lowest-hanging fruit. Your security system simply ensures that your fruit isn't the lowest-hanging target.

But that falls down when it comes to cybersecurity -- because of the shotgun approach. Even the most casual script kiddies use sophisticated port scans, SQL injection, worms and other automated techniques. Those are the equivalent of trying to break into every car in the parking lot simultaneously. I'm not sure that hoping that someone else's computer is a lower-hanging target is enough. It's unfortunate, but our networks, servers, desktops AND applications have to become fortresses. At every layer of the stack, we're being targeted.
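To make the "every car in the parking lot" point concrete, here is a minimal sketch of the automated sweep idea: a simple TCP connect check run identically against a list of targets. The addresses below are hypothetical (RFC 5737 documentation addresses), and this is an illustration of the concept, not a real scanning tool:

```python
# Minimal sketch: probe every host/port pair the same way, the automated
# equivalent of rattling every car door in the lot.
import socket

def port_open(host, port, timeout=0.5):
    """Return True if a TCP connection to (host, port) succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Hypothetical targets, using RFC 5737 documentation addresses.
targets = [("192.0.2.10", 22), ("192.0.2.11", 80)]
for host, port in targets:
    print(host, port, "open" if port_open(host, port) else "closed")
```

The point isn't the code, which any script kiddie's toolkit automates far more aggressively; it's that the marginal cost of trying one more target is effectively zero, which is why "be less attractive than your neighbor" doesn't protect you online.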

9.24.2006

2007 editorial calendars are up!

(Cue the trumpet sound effects) The 2007 editorial calendars for SD Times and Software Test & Performance are now available.

For those of you who don't follow such things, a magazine or newspaper's editorial calendar provides insight into some of the feature articles that the publication will cover during the next year. It's traditional for them to come out in September or October. Edit calendars often also provide information for advertisers regarding the cutoff dates for reserving ad space and for delivering ad materials. Edit calendars are used by writers, advertisers, and corporate communications professionals.

It's important to note that edit calendars are always subject to change without notice. While we do our best to predict the long-lead stories for our publications, software development is a fast-evolving industry. So, you might want to bookmark the editorial calendars, and check back every few months to see if they've changed.

There's one 2007 edit calendar still to come, for Eclipse Review. We'll post that in a few weeks.

9.23.2006

Eclipse bugs resolve later...or never

One of the challenges for any software development project — whether enterprise or for-sale, open source or not — is what to do about all those pesky defects that nobody’s going to fix. Why aren’t they going to be fixed? It might be that they’re not a show-stopper, or that there are other priorities, or there’s no easy fix, or simply that nobody wants to do it.

Every non-trivial software project has bugs that won’t be fixed. Sometimes you know that it’s not going to be fixed, and at other times, everyone has the best of intentions, but it just never gets done.

One of the benefits of most open source projects is transparency. Take Eclipse, which uses a public bugzilla feed to let users and contributors report defects. When defects are reported, often they’re resolved, but sometimes they’re marked RESOLVE LATER. Does that mean that the issue truly will be resolved later, or as some people suppose, is that a polite euphemism for RESOLVE NEVER?

Let’s face reality: Not every bug is going to be fixed. Yes, it would be nice to have less ambiguity, and to know, for certain, that a specific bug is going to be (or not going to be) addressed. But at least with a system like RESOLVE LATER, you can see whether action is being taken. With non-open-source projects, or OSS projects that operate with less transparency than Eclipse, bug reports go into a black hole.

By contrast, with commercial software, a bug will only be fixed if the software owner sees the business value of fixing it. While I agree that RESOLVE LATER is suboptimal, it’s easy enough to see that a bug that’s been ignored for months or years isn’t going to be addressed. And that’s valuable information.

Goodbye, Patricia

The HP spying investigation is getting stranger by the day. When the company reported that its chairwoman, Patricia Dunn, was going to step down as of January, many of us knew that wouldn't hold -- she had to go, and she had to go now. Only a few days later, after more revelations, she resigned effective immediately on Sept. 22.

But what about the new chairman, CEO Mark Hurd? He's presided over a remarkable turnaround; HP's fortunes and reputation have improved tremendously since he took over from the disastrous Carly Fiorina. It would be a significant blow to HP were he to be forced out due to this scandal -- but that's a real possibility, given numerous reports that Hurd was in the loop regarding Dunn's espionage on board members and journalists.

Indeed, as reported in this Fortune story published that same day, Hurd admits to having known that HP was involved with questionable activities. Isn't it his job to intervene? It doesn't look good for Hurd, and it doesn't look good for HP.

9.22.2006

Greetings, Earthlings!

Welcome to my blog. It had to start somewhere, and this is where it starts. And the trek had to start sometime; it should have started a long time ago, but it didn't, so here we are.

This blog will be a spot to discuss topics of professional and personal interest to me, mainly in the realm of information technology: software development, security, enterprise computing, and the like.

Because blogs are better with links, I'll start with one of my own, detailing my most recent experiences with an oil change, and how it relates to software security and script kiddies. This is one of my weekly columns, called "Zeichick's Take," found in SD Times' e-newsletters.

About Me

My Photo
Co-founder and editorial director of BZ Media, which publishes SD Times, the leading magazine for the software development industry. Founder of SPTechCon: The SharePoint Technology Conference, AnDevCon: The Android Developer Conference, and Big Data TechCon. Also president and principal analyst of Camden Associates, an IT consulting and analyst firm.