Monday, 20 May 2002
9:11 - I managed to finish proof-reading the galleys yesterday. I sent off the change document to O'Reilly this morning. My editor will probably gasp and clutch her chest when she sees the number of changes I asked for, but knowing O'Reilly they'll get them all incorporated. At least I hope so. I actually haven't quite finished the changes yet, because I still have to add a table and some text to the Motherboards chapter to cover the new Intel 850E and 845E/G/GL chipsets. I'm working on that now, and should have it up to O'Reilly later today. After that, I need to spend some quality time filling out the marketing questionnaire that O'Reilly sent me Friday afternoon.
Once I get all that finished and off to O'Reilly, I have to jump in to my to-do list and pay some attention to all the other stuff clamoring for my attention. I may be exceeding my all-time record for the number of balls I can keep on the floor.
Tuesday, 21 May 2002
9:18 - The galley slavery is finally over (I think). I sent off the change document to O'Reilly yesterday morning, followed yesterday afternoon by a Word document that included some re-writes and additions to the Motherboards chapter, primarily to add coverage of the new Intel D850E/845E/845G/845GL chipsets. I then started work on the next thing on my to-do list, which was a request by the O'Reilly marketing folks to answer a long questionnaire that is designed to help them sell the book.
In the interim, I was still worried about the new chipset stuff I'd added. Intel was to have announced the new chipsets (except the 845GL) yesterday, but as of yesterday afternoon when I sent in the final changes there still wasn't anything new up on the Intel site. That was odd, because if they'd taken their usual course I would have expected the data on the new chipsets to be posted sometime late Sunday night. At any rate, when I checked the Intel site after dinner, they'd finally posted the information on the 845E and 845G chipsets. Everyone who tried to write about them before the fact, including Tom's Hardware and AnandTech, got at least some minor things wrong. I spent some time madly reading through the tech documents, then updated the final rewrite and sent it off to O'Reilly.
I'm now done with the book text unless I get queries from the production folks.
Meanwhile, back at the ranch, I got some email on Pournelle's back-channel mailing list that referred me to an article in eWEEK that covered some of Allchin's testimony in the non-settling states' action against Microsoft. Apparently, Allchin testified that some of Microsoft's software has security problems so severe that they can't be fixed, and that releasing the source code would pose a danger to national security. One of Allchin's comments seems to indicate that even naming the affected modules is cause for concern.
It would be easy to talk about patriotism, refuges, and scoundrels, and that may in fact be the case here. Allchin may simply be playing the National Security card to avoid opening Microsoft source code if the States prevail. That argument seems easily answered, though, by appointing an independent, trusted expert to review any claim by Microsoft that a particular part of the source code should not be disclosed for National Security reasons.
More to the point, though, Allchin's testimony raises some serious questions about whether any government agency should be using Microsoft software at all. Or, for that matter, whether any private company with a fiduciary responsibility, whether to its customers or merely to its shareholders, to maintain data security can afford to continue using Windows. When, inevitably, private data is released to unauthorized parties and liability is subsequently determined, one measure of a company's culpability is (or should be) whether that company "knew or should have known" that their software was inherently insecure, as Allchin appears to be saying is the case for Windows.
Certainly, the fact that Windows has had and is likely to continue to have gaping security holes comes as no surprise to anyone, but the key issue here is that Microsoft appears to be admitting in sworn testimony that Windows itself is inherently insecure and cannot be fixed. If that is the case, how can any organization, whether a private company or a government agency, do anything other than migrate away from Microsoft software as quickly as possible? To take any other course may open that organization to huge liability settlements if their data are compromised. How can any company argue that they took reasonable and prudent measures to secure the data with which they were entrusted if they continue using Microsoft software? I don't know. I'm not a lawyer (Thor be praised!) but I suspect there are many lawyers just waiting to jump on this.
In effect, Microsoft is arguing that security-by-obscurity is good enough, but anyone who understands anything about software security knows that is not true, particularly for software as ubiquitous as the Windows operating systems. Even if it were true, the simple fact is that Microsoft has no obscurity. They'd like us to believe that their source code is unavailable to would-be crackers, but the truth is that thousands of people inside and outside Microsoft have access to the source code. Anyone who wants access to that source code badly enough already has access or could get it by fair means or foul. So in effect what Microsoft is saying is:
a. From a security standpoint, Windows is too badly broken to be fixed.
b. As long as our source code remains undisclosed, everyone is safe.
c. Oh, by the way, there are thousands of people inside and outside the US who have access to part or all of our source code.
From that, it would appear that the inevitable conclusion is that Windows is too badly broken to be fixed, that none of us are safe, and that the only responsible decision is to migrate from Windows to a more secure operating system as soon as possible. Some have suggested that one possible answer would be for Microsoft to convert Windows to Open Source and allow hordes of skilled OSS programmers to fix it. I'm afraid that wouldn't work, though. By all accounts, Windows is a complete mess, so I'm afraid the same thing would happen that happened with Mozilla. The original plan was to use Netscape code as the basis for Mozilla, but after a time-consuming false start, the Mozilla group finally reached consensus that it would be easier just to scrap the Netscape code and start from scratch.
So it seems that by Microsoft's own sworn testimony Windows is fatally flawed. Or am I missing something?
Wednesday, 22 May 2002
8:58 - I got an interesting email yesterday. It made me stop to think about something that has become common wisdom, the idea that Windows and Windows applications are victimized by viruses and other exploits just because they are so dominant. I decided that that really didn't explain the problem, so I replied in some detail.
Well, Willie Sutton once famously didn't say that he robbed banks because that's where the money was, but the point is valid nonetheless. Or at least it seems to be at first. Certainly the plethora of viruses, worms, and Trojans is directed at Windows desktop operating systems and applications, but I think that's less because of their ubiquity and more because of their inherent insecurity, both by design and by default configuration.
Microsoft has historically been more concerned with ease-of-use than with security, and despite their recent lip service toward security, that remains largely true. Why else would they still be shipping their operating systems and applications configured by default to emphasize convenience at the expense of security? Why are such hideous security holes as Windows Scripting Host even installed by default, let alone enabled by default?
The truth is that there are tens of millions of desktop systems running operating systems other than Windows, including Apple operating systems and Linux, and yet those have been largely unaffected by the plague of viruses and other exploits. Although one might argue that the lack of exploits is because those other operating systems make up only 10% or so of desktops, I think a more reasonable explanation is that it's because those other operating systems don't make it trivially easy to create such exploits, as Microsoft operating systems and applications do.
But that's not the real issue. As aggravating as viruses and worms and Trojans on desktop operating systems are, the server is the real Holy Grail among crackers trying to gain access to data. In server space, Microsoft is just another player, and not even the major one. Think back to the many times you've read about exploits against servers. How often was the server being exploited running Microsoft Windows, IIS, or another Microsoft server application? Almost always. How often was it running UNIX, Linux, NetWare, Apache, or another non-Microsoft operating system or application? Very seldom. And here you can't blame Microsoft's large market share because Microsoft is usually the second, third, or lesser player in any given segment.
I don't think we'll ever be free of the vandals who create viruses, but neither do I think that anything other than Microsoft's pathetic security can ultimately be blamed for how common such exploits are today. It's easy to moan about security holes, and to ignore them for weeks, months, or years (as Microsoft has done), or even to blame the person who discovers a hole (as Microsoft has also done). But I notice for all the talking Microsoft does about security, they actually don't *do* much about the problems. With Linux, on the other hand, when a problem crops up it's almost immediately fixed, usually within a few days and sometimes even the same day it's announced.
In effect, Linux and other Open Source Software undergoes peer review on a massive scale. That's proven to be the best way to ensure secure mainstream operating systems and applications, and may in fact be the only way to do so.
Speaking of Linux, FedEx showed up yesterday with Red Hat Linux 7.3 Professional. I already have one box running Red Hat Linux 7.3 in Workstation mode. I think when I get a spare moment, I may strip that box down to bare metal and bring up a Red Hat Linux 7.3 system in Server mode. My servers now run Windows NT Server 4.0. I'm not really too worried about security, because my firewall is adequate to stop any likely attempt to crack my systems. But I did make a decision some months back that I wouldn't bring up any more Microsoft servers, and that the next server I installed would run Linux and Samba.
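When I do bring that server up, the Samba side should be straightforward. Here's a minimal smb.conf sketch of the sort of thing I have in mind — the workgroup, share, and path names are placeholders for illustration, not my actual configuration:

```ini
; Minimal Samba file-server sketch (placeholder names throughout).
; With NT clients, encrypted passwords are required.
[global]
   workgroup = HOMENET
   server string = Linux file server
   security = user
   encrypt passwords = yes

[data]
   comment = Shared files
   path = /srv/data
   read only = no
```

With encrypted passwords enabled, each Windows user also needs a Samba password entry, created with smbpasswd -a username.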
What I found interesting about the Red Hat box is that it's stamped "Eval Copy - No Support Included". That's a departure from the usual situation. When a company sends me eval software, they're usually anxious that it work for me, and so they typically include not just the standard support services but often a private telephone number that is answered by their top-tier support folks or even the software developers themselves. It's interesting that Red Hat not only didn't do that, but explicitly disclaimed providing any support for the product at all. Perhaps they are very confident that their product won't require any support.
Oh, well. I'd better get back to work.
Thursday, 23 May 2002
8:58 - Interesting article in the paper this morning about flat-rate long distance and how it's the coming thing. I predicted the coming end of per-minute long-distance billing years ago, but now it seems that it's finally coming to pass. The article blames the decline in per-minute long-distance usage on the Internet and those cell phone plans that bundle huge numbers of long-distance minutes, and predicts that within five years per-minute billing for long-distance service will disappear. As I said three years ago or so, that makes sense, because the long-distance part of a connection is the cheap part. It's the connection between the telco central office and the subscriber's home or office that costs real money to build and maintain.
The only remaining problem is that the flat-rate long-distance programs are grossly overpriced. AT&T, for example, charges $20 a month for all-you-can-eat long distance service for calls made to other AT&T subscribers (and 7 cents/minute for calls to non-subscribers). But that $20 is probably more than twice what the average residence pays for per-minute long-distance service now. Market forces should take care of that problem, though. I expect to see that charge drop to $10/month within a year or so, and eventually to $5/month. At some point, we'll probably see local and long-distance calling merge into a unified phone service plan that charges extra only for calls to foreign destinations. That'll be an interesting change, because at that point the long-distance carriers will no longer have us as customers. Instead, their customers will be the local exchange carriers.
As to how much change that'll make in calling patterns, my guess is not much. Years ago, the cost of long-distance calls was a factor, but that hasn't been true for a long time. If, when I woke up tomorrow morning, long-distance calls were "free", I probably wouldn't make any more long-distance calls than I do now, nor would I talk for any longer. The same is probably true of most people. Certainly, phone companies could expect to be swamped until the novelty of "free" long-distance calling wore off, but that probably wouldn't take long.
I spent some time last night reading through the various manuals that come with Red Hat Linux 7.3 Professional. In absolute terms, they're nothing special, but they're about five orders of magnitude better than the documentation that comes with Windows. It looks like I should be able to set up a Linux server that could replace my NT Server boxes without too much difficulty. That's a project for later, but it will happen. I wonder what to do about all these Windows NT Server licenses and CALs I'll then have sitting unused.
And here's an interesting take on the security issues that have been getting a lot of attention the last few days:
I don't know. I've been keeping a close eye on reports of Linux security problems, and I haven't seen all that many. You may be right, though. I haven't written a line of serious code in more than 20 years, so I'm out of touch. It seems to me, though, that Linux has enough eyes looking at its source code that most vulnerabilities have probably been eliminated and those that remain are likely to be fixed quickly as they're uncovered. I wish I could say the same for Windows.
I've gotten several queries over the past months from people who use Windows XP and whose CD-ROM drive and/or CD-RW drive have "disappeared". That is, they show up in the BIOS boot screen, but Windows can't see or access them. Microsoft has a KB article here that explains the problem. It will come as no surprise to anyone who knows my opinion of Adaptec/Roxio Easy CD Creator and DirectCD that those programs are involved. To quote Microsoft:
Which seems a long way of saying that any version of Easy CD Creator or DirectCD can cause the problem. The fix is to delete a couple of registry keys, although the ones you need to delete aren't intuitive. The better solution is not to install Adaptec/Roxio CD burning software in the first place. Friends don't let friends use Adaptec/Roxio Easy/DirectCD.
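For the record, here's a sketch of the fix as I understand it from the KB article. The values to delete are the filter-driver entries under the standard CD-ROM device class key, which is where the Adaptec/Roxio software inserts itself; back up the registry before trying this, and double-check the GUID against the KB article rather than taking my word for it:

```
rem Delete the filter-driver values that Easy CD Creator / DirectCD
rem install under the CD-ROM device class key (back up the registry first)
reg delete "HKLM\SYSTEM\CurrentControlSet\Control\Class\{4D36E965-E325-11CE-BFC1-08002BE10318}" /v UpperFilters
reg delete "HKLM\SYSTEM\CurrentControlSet\Control\Class\{4D36E965-E325-11CE-BFC1-08002BE10318}" /v LowerFilters
```

After deleting the values, reboot and the drives should reappear — though of course any software that depended on those filter drivers will need to be reinstalled or, better, left off the system entirely.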
Friday, 24 May 2002
9:21 - Mozilla 1.0RC3 has been released and is available for download here. I downloaded it this morning and installed it. I was surprised to find that the download was, if anything, faster than usual. The 10 MB download took less than a minute. Perhaps I just got connected to the ftp server before all the /. readers got their first cups of coffee.
Not surprisingly, RC3 looks just like RC2. I'm hoping that they've fixed the keyboard focus problem, but I'm not holding my breath. While it still doesn't have the speed or feature sets of IE or Opera, Mozilla is coming together as a first-rate browser. Although I'd miss a lot of the features of Opera, I could use Mozilla as my primary browser, if only they'd fix the keyboard loss-of-focus problem.
Interesting news from Holden Aust:
Interesting news, indeed. I read your first message last night, and went over to read some of the material posted by Sony. It struck me then that this wasn't a particularly significant event as far as Linux qua Linux. After all, running a Linux GUI workstation in 32 MB of RAM (not expandable) doesn't get anyone much forrarder. It was in reading the FAQ that I noticed the truly significant news, which your second message covers. I've already read some complaints about Sony "selling" Linux (as if Red Hat doesn't), but it struck me that $200 isn't an outrageous price for a proprietary 40 GB hard drive, keyboard, mouse, cable, and a copy of Linux (not to mention 6/7ths of the Sony PS2 documentation on disc and a run-time copy of the development environment). I could be wrong, but my guess is that Sony will sell at least a million of these kits, which should make for an interesting change in the landscape.
Is it just me, or is Linux really on a roll? I expect to see a lot of Linux and OSS coverage in many of the journals I read, but it seems to me that Linux is truly breaking out into the mainstream now. I see articles in the morning newspaper and other mainstream publications talking about Linux in a very favorable light, and even putting it forward as a reasonable replacement for Windows.
It also seems that Linux is gaining mindshare among regular people. A year or two ago, an article about Linux in a mainstream publication would have spent some time explaining what Linux was. Recently, mainstream articles seem to assume that a general readership already knows what Linux is. And the articles themselves are changing. A year ago, mainstream articles typically treated Linux as an experimental operating system, fun to play with and free, but not a serious alternative to Windows. Now, I'm seeing mainstream articles about Linux "wins" in school systems, government offices, and so on. Mainstream publications, with some exceptions, are no longer marginalizing Linux. That can't be good for Microsoft.
10:13 - What is it with this Nigerian Scam thing? I had been getting one such spam every day or two. In the last week or so, I've started getting a dozen or more of them every day, six so far just this morning. There can't be all that many people stupid enough to fall for this scam and yet smart enough to have a computer and email, so perhaps the scammers are running out of marks and cranking up the spam machine in a desperate attempt to find more marks. The South African cops arrested a half dozen or so of these scum the other day, but clearly there are more of them still scamming. I wish someone would hunt them down and kill them. If nothing else, they're giving the entire country of Nigeria a bad name.
Thanks to Greg Lincoln, I now have password-protected directories working again. That means the Subscriber Page is accessible to subscribers again (although there's not much there at present). Speaking of Greg, if you have any interest in Linux and haven't checked out Greg's and Brian Bilbrey's LinuxMuse, I recommend you do so. They're just getting started, but I've already found several articles there that provide in-depth coverage that I haven't found elsewhere.
Saturday, 25 May 2002
8:24 - The bad news is that Mozilla 1.0RC3 doesn't fix the keyboard problem I've been complaining about for two or three months now. The good news is that I've finally found confirmation by someone else that the problem does exist. I've gotten messages from several people telling me that the problem I've described with the keyboard losing focus doesn't exist, that it can't possibly exist because anything that blatant would have been found and fixed by now, and that I'm just being mean to Mozilla.
As I've said repeatedly, I do like Mozilla, and the problem is not something I'm imagining. It occurs on three or four systems, and has survived unfixed after several Mozilla upgrades. But you don't have to take my word for it. Here's what someone else says in Bugzilla Bug 142901: New windows won't "focus" if opened when all other Mozilla windows are minimized:
And it's actually worse than that. If I open a "dead" Mozilla window, for example, I can click on an entry on my Links page, say Google, and Google in fact loads. The problem is that not just the URL box is dead in Mozilla, but the keyboard itself is dead. That means when I click to the search box on the Google page, I get no cursor and nothing I type is entered in the search box. Mozilla is, as I said, simply not accepting any keyboard input.
If this report correctly analyzes the cause, what made me subject to this bug is that I often have (many) browser windows minimized and that I almost always browse with my browser maximized. I'm not entirely sure that this guy has the cause figured out correctly, but I have noticed that when I start a new Mozilla instance with another instance not minimized the problem does not occur. Or perhaps I should say that it hasn't yet. Unlike the person who reported this bug, I don't find it reproducible. That is, sometimes when I have one or more instances of Mozilla minimized and start another instance, the browser comes up and works fine. Other times it doesn't. So I don't know for sure whether having another Mozilla window open allows the new instance to open normally, or if I just haven't been bitten yet by Mozilla's unpredictable loss of keyboard focus.
I still find it incredible that a bug this obvious and important has survived all the way into RC3, let alone that the severity of the bug is categorized as "minor". Surely a bug that causes an application to stop accepting keyboard input under common conditions should be critical. Oh, well. Perhaps they'll get it fixed by the time 1.0 actually ships.
Sunday, 26 May 2002
9:15 - Poor Jerry Pournelle. He sent out a mailing to his subscribers yesterday and somehow copied the distribution list to the CC line instead of the BCC line in Outlook. So now all of his subscribers have the email addresses of all his other subscribers. I haven't counted, but there are probably 1,000 or more names on the list. He immediately sent out an apology, although I can't imagine that too many people would be upset at the mistake.
It seems to me that software should catch probable errors like this. When Jerry clicked Send, I'd expect Outlook to pop up some kind of warning message: "You're about to CC this message to more than 1,000 people. Are you sure you don't want to BCC it instead?" After all, how often does anyone want to CC a message to 1,000 people? Surely 99.99+% of such messages are intended to be BCC'd. So why doesn't the software catch that?
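The check I have in mind is trivial to implement. Here's a sketch of the logic in Python — the threshold of 50 is an arbitrary guess at where "mailing list" behavior starts, and obviously real mail software would hook this into its Send button rather than a standalone function:

```python
# Sketch of the sanity check mail software ought to perform before
# sending. The threshold is an assumption; real software would make
# it configurable.
CC_WARNING_THRESHOLD = 50

def cc_warning(to_list, cc_list, threshold=CC_WARNING_THRESHOLD):
    """Return a warning message if the CC line looks like a mistake,
    or None if the message appears sane."""
    if len(cc_list) >= threshold:
        return (f"You're about to CC this message to {len(cc_list)} people. "
                "Are you sure you don't want to BCC it instead?")
    return None
```

A mail client could pop that string up as a confirmation dialog, defaulting to "move these to BCC." The cost of the check is nil, and it would have saved Jerry an apology mailing.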
Speaking of software, I find that I'm using Open Source stuff more and more. Mozilla is my default browser, and I think I'm going to replace Word and Excel with OOO. I don't make great demands on a word processor or spreadsheet, and from what I have seen so far, OOO is more than Good Enough. It also has the advantages of being Open Source and cross-platform. The only thing holding me up is that a few things I do require Office. For example, the O'Reilly author's template is for Word 2000. I've already emailed Tim O'Reilly to ask if they're going to release an OOO version of the author's template.
Open Source is making great inroads on my software Top Ten list. On my Windows boxes, the only major applications missing are replacements for Outlook 2000 (too bad Evolution isn't available for Win32) and FrontPage 2000. And the only reason I'm using FrontPage is for its site-management features. I'm as likely to edit an existing page in Mozilla Composer as in FrontPage Editor. Both of them occasionally butcher absolute versus relative links, so the danger is the same in using either.
I'm quite pleased at the progress OSS applications are making. It may not be a year until I have a little stuffed penguin on my desk after all.
Copyright © 1998, 1999, 2000, 2001, 2002, 2003, 2004 by Robert Bruce Thompson. All Rights Reserved.