

Daynotes Journal

Week of 24 January 2000

Sunday, 30 January 2000 08:55

A (mostly) daily journal of the trials, tribulations, and random observations of Robert Bruce Thompson, a writer of computer books.



Search [tips]

TTG Home

Robert Home

Daynotes Home

Links

Special Reports

Current Topics




Monday, 24 January 2000

[Last Week] [Monday] [Tuesday] [Wednesday] [Thursday] [Friday] [Saturday] [Sunday] [Next Week]


School is open today, although delayed by two hours. Barbara says the roads aren't all that bad, and she's off on errands and a gym visit. We have more snow forecast for Wednesday, which means the grocery store shelves will start emptying rapidly. Wednesday is her usual shopping day, so to avoid the rush she's also going to stop by the store to buy Coke, milk, and other essentials.

I see that the judge has ruled in favor of the movie industry with regard to DeCSS, ignoring the fact that it's impossible to unexplode a firecracker. EFF will appeal, of course, but the damage has already been done. If you're something the size of the movie industry, you can roll over the rights of little guys, and there's nothing they can do to prevent it. It's almost enough to make honest people want to go out and start pirating DVDs. I won't bother, because I regard watching movies as a waste of time. In the two hours or so that it takes to watch the best movie ever made, I could read even a mediocre book and be the better for it. I started to watch a tape of Arnold Schwarzenegger's Predator last night. I hadn't seen it, and it got very good reviews. What a waste of time. I gave up half an hour into it and started reading a mystery. I can't even remember the title or the author, but it was certainly a more enjoyable use of my time than watching a ridiculous movie.

The movie and music industries are fighting a losing battle, and I can't wait to see them lose. Of the two, I have much more contempt for the music industry. The movie industry at least creates the product. The music industry creates nothing. They're leeches, sitting between the content creators and consumers. They're doing everything they can to make sure they can continue to grab the lion's share of the profits, but MP3 will ultimately do them in. I once suggested, only half in jest, that someone who wants the latest Tom Petty shouldn't buy the CD. Instead, he should pirate the CD and then put a dollar bill or two in an envelope and mail it to Tom Petty. Tom Petty ends up making more money that way and you get the CD cheaper. The only loser is the music industry, which richly deserves to go bankrupt. Morally, you've taken the high ground, although legally you're still a felon. I use Tom Petty as an example because he actually tried to post some of his music as MP3s on his web site, but was stomped down by the music industry. Bastards.

* * * * *

-----Original Message-----
From: Chuck Waggoner [waggoner at gis dot net]
Sent: Saturday, January 22, 2000 8:00 PM
To: webmaster@ttgnet.com
Subject: More on Digital Music

I'm not a lawyer either, but I'm becoming suspicious that even copyright attorneys are either hedging their bets or simply want their clients to play things very safe. Here's the latest item, one that relates somewhat to your mention of backing up one's CD music.

My friends working in radio report that a little cottage industry had been developing: music directors with large record collections were copying their tunes to hard drive and supplying pre-programmed 'rotations' of music for use with the hard-drive-based digital audio automation equipment that has taken the radio industry by storm.

Most radio stations already own either the vinyl or the CDs of the records they play, so it was assumed that this method of supplying the music to the stations was within the law. After all, most of the songs radio stations play are either given to them free by the record companies and distributors, or purchased at a special discount that lowers the price to an insignificant matter of pennies, meant only to recompense basic costs in the distribution channel. Therefore--even if radio stations didn't own their own copy of the music--their real obligation is in paying public performance fees to the music licensing companies, usually based on what they broadcast and normally determined by a yearly sampling period of a week or two. So no one in the process seemed deprived of any revenue they would ordinarily get by these new music programming enterprises.

But recently, this little industry has been squashed by the expanding MP3 flap. The folks providing this new service have been informed by lawyers that, in order to be legal, they must use the actual vinyl or CD recordings owned by the radio station to which they are supplying the programming. That basically put this new industry out of business before it ever really got off the ground. Makers of the automation equipment, which also had been supplying the same service as an inducement to buy their hard drive playback and automation products, have also ceased providing it.

Apparently, the battles over control of digital music are nowhere near over, but are expanding, instead. I'm told that this is also influencing how large multiple station radio groups are functioning. Until recently, stations under common ownership that played "oldies" or "big band" music, were trading or centralizing their music sources in order to expand the library available to each station. Now, even that is being brought into question--and some of those records are rare: finding one copy is often difficult, let alone a copy for each station in the chain that might play it. Although this has been a well-known and long-standing practice, never before was it challenged, until stations began playing music from the hard drive.

If I understand what you're saying, someone was copying music and selling it to purchasers who already owned a copy of that music in another form. If so, that's quite different from what I was talking about, which is one person making copies, whether from his own copy or a borrowed copy, for his own personal use of music that he already owned. But I agree that copyright law is confusing.

* * * * *

-----Original Message-----
From: John Dougan [mailto:jdougan@bigserver.com]
Sent: Sunday, January 23, 2000 6:24 AM
To: webmaster@ttgnet.com
Cc: John Dougan
Subject: W2K Performance

You may not be so far off with your observation on W2K performance (or lack thereof). According to ZDNet and MS's own web site:

So it looks like W2K can actually be slower on machines with more than 32MB of RAM. Then you add in Active Directory turning off disk caching (to ensure data integrity), which will slow things down lots more. According to Novell, this can drop transfer rates from 7.5 MB/sec down to 0.3 MB/sec:

I know that they are more than somewhat biased, but the numbers they quote for caching vs. non-caching are pretty close to what I remember from other sources.

Thanks.

* * * * *

-----Original Message-----
From: John Sloan [mailto:john.sloan@sympatico.ca]
Sent: Sunday, January 23, 2000 9:29 PM
To: webmaster@ttgnet.com
Subject: Slow cards in Solitaire and what they really mean...

Hi,

I found your site from a reference at Chaos Manor. It looks very interesting and useful. Thanks!

I noticed your comment about the speed of video drivers in Win2k. I think you're mistaken, and what really happened was that MS changed the Solitaire game in Win2k. There are at least two changes:

1) The cards do fall a lot slower when you have finished a game. 
2) If you right-click, then all the cards that can be put on the stacks in the upper-right will be sent there. This really helps lower the total time for finishing a game.

To prove this, I ran the "sol.exe" program from an NT4 installation on my Win2k system, and when the cards fell, they fell in about 2 seconds. This is what I expected from that version of Solitaire on NT4, and this was running on Win2k Gold. As a cross-check, I ran the "sol.exe" from the Win2k installation and the cards fall very slowly when I win a game.

I have played MS Combat Flight Simulator under NT4 and Win2k Gold, and the speed of the game's video is about the same. Because of this, I am pretty sure that the correct video drivers are being installed for my video card (Creative Labs Graphics Blaster Riva TNT).

As far as overall speed, I find Win2k Gold faster than RC2, and I notice that I do a SETI@home "work unit" from the SETI@Home screen saver in less time than under NT4 on the same hardware. My system is a PII-350 with 128MB of RAM and a SCSI disk, and I expect that different systems will get different results for Win2k.

One thing that is really strange about Win2k is the way it deals with BIOS settings. Under NT4, the BIOS was ignored after boot, but hardware that was configured by BIOS settings was usually left alone by NT. Win2k does something entirely different. I found this out because when I had NT4 running I had some problems with IRQs, and so I had the IRQ for each PCI slot separately defined on each slot and the USB was disabled to save an IRQ. But after installing Win2k, I plugged in a USB Logitech QuickCam and it asked for drivers which I got from the QuickCam CD. It wasn't until later that I remembered that the USB was disabled in the BIOS! So, I checked and Win2k had reassigned all the IRQ settings for all the slots to a shared IRQ, and turned on the USB ports and assigned them an IRQ. Since my hardware all works great this is just fine for me, but this behaviour is going to be something that troubleshooters will need to be aware of.

I hope you find this helpful!

John Sloan
Technical Architect
MCSE, MCP+I

Thanks. I couldn't verify your experiment with the NT4 version of sol.exe, because I don't normally install games when I install NT. W2K, however, installs the games automatically, so I had them on all my W2K boxes. I tried running Solitaire on each of them, and you're right. The cards bounce off the stacks much more slowly at the end of the game on all of the systems. That sure explains it, because the video on all of these systems appeared to be operating at normal speed except in Solitaire.

As far as the behavior you describe later in your message, I assume that's just the fact that W2K supports Plug-'N-Play whereas NT4 does not. I assume that if your BIOS has a setting for "PnP OS?" you must have set it to "Yes". If you set it to "No", I suspect you'll find that W2K can no longer mess with the IRQs you've mapped to particular PCI slots.

 


 

 

 



Tuesday, 25 January 2000



We had more snow and freezing rain overnight, with still more forecast for today, tonight, and tomorrow. It looks like we have an inch or two (2.5 to 5 cm.) of new accumulation, with possibly another 4 to 6 inches due to arrive over the next day or so. Low temperatures in the high teens (about -8C) and high temperatures at or just above freezing for the next few days mean this stuff isn't going away soon.

It may not have been my imagination when I reported earlier that Windows 2000 Professional seemed slower than Windows NT Workstation. The Register reports that Microsoft now concedes that W2KP is "comparable" in speed to NT4W on systems with 64 MB or less RAM, and slower than NT4W on systems with 128 MB or more. Quite a change from Microsoft's earlier position that W2KP was "up to 24% faster". 

I also question the reports of greatly increased stability with W2KP. It's not that I doubt that W2KP is very stable. I'm sure it is. It's that I doubt there's much basis for comparison, because NT4 is also very stable. I have had several systems running NT4 for years, and NT4 is about as stable as anything I've used. Some of my NT4 systems run for literally months between restarts, and those usually happen only when I need to power a system down for some reason like changing hardware. 

So, although I don't dislike W2KP, I don't really see that it's much of an upgrade from NT4W unless you're simply dying to use USB. Certainly, I've seen nothing that would make W2KP a "compelling" upgrade, as has been reported elsewhere. W2KP runs slower than NT4W (or, at best, the same speed), is no more stable, is quite expensive, still seems to have problems working properly with an NT4 DHCP server, and has problems running some software that works perfectly on NT4W. Microsoft hasn't even certified Office 2000 to run on W2KP (!), although it seems to work fine here. Admittedly, I'm in a small minority here, perhaps even a minority of one, but when I see a naked emperor I point him out.

* * * * *

-----Original Message-----
From: Chuck Waggoner [waggoner at gis dot net]
Sent: Monday, January 24, 2000 12:09 PM
To: 'Robert Bruce Thompson'
Subject: RE: More on Digital Music

> that's quite different from what I was talking about<

Agreed. I mentioned it because of the similarity of the challenge copyrights are facing, and the absurd protectionist steps courts and industries are taking in view of the technological tidal wave that is engulfing them.

I do believe that the biggest challenge content creators face, is how cheaply and easily a digital product can be duplicated and distributed--often requiring little or no effort on the creators' part. In this Internet era, it is becoming increasingly difficult to enforce protections based on the old, have-it-in-your-hand, sell-it-in-a-store model.

I also think the fact that there are people willing to create and distribute software for free, like Linux and some of its applications, has long-term implications, not only for Microsoft and others, but for other industries as well. What will the world look like if many of the old ways, like copyrights, become impotent or outmoded by the changes now coming to the surface?

Interesting that, just as we're discussing this, the RIAA has taken MP3.com to court for their "my MP3.com" service, whereby anyone can upload his own albums in MP3 form to the MP3.com server, and then access them from anywhere. What may be significant is that one doesn't actually upload the music if it already exists on the server. Instead MP3 flags the account as having access to that music, and simply uses the same music that someone else had previously uploaded. Oh, well. RIAA delenda est!

 


 

 

 



Wednesday, 26 January 2000



Thanks to all of those who wrote expressing concern about the winter storms we've been having around here. Actually, Winston-Salem missed the brunt of it. We got only about an inch (2.5 cm) of new snow from the big storm that passed here yesterday. Kernersville, about 10 miles (16 km) east of us, got about a foot (30.5 cm). Raleigh, a couple hours east of us, got 20 inches (51 cm). Some areas not far to the southeast of us got 24 inches (61 cm). Winston-Salem is in good shape--roads clear, power working, and so on--but those to our east really got nailed.

The Register posted another article this morning about Windows 2000 performance problems, this time with respect to video drivers. Apparently, many of the video drivers that ship with W2K are missing minor features such as support for hardware acceleration, D3D or OpenGL. It appears that some of the drivers shipping with W2K are enough to make your screen light up, but that's about it. That's a minor issue for Microsoft, of course, compared with the importance of meeting an arbitrary ship date. As I've said in the past, I recommend refusing to upgrade to Windows 2000 Professional until SP1 ships, and to Windows 2000 Server until at least SP2, and preferably SP3. SP1 is already scheduled for release mid-year, which should tell you something. Let the other guys end up with the arrows in their backs.

I've encountered something truly strange when attempting to process web stats. I've been using Analog for more than a year to process the raw web logs that pair networks provides into usable reports, with never a problem. I did the first week of this year, 20000101 through 20000107, with no problems. But when I downloaded my daily log file for 8 January, it caused Analog to GPF. I extracted the data from the zip and tried again. Same problem. The log files for 9 January and 10 January processed normally, as did Pournelle's log files for the whole period, so I figured that the 8 January file had something odd about it. Then the same thing happened on the 11 January file. I mailed the author of Analog, offering to send him the files. He mailed me back to ask for them, but I haven't heard anything from him.
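Since I couldn't see anything different about those files by eye, one way to narrow a crash like this down is to scan the raw log for lines that don't parse. Here's a sketch, assuming the logs are in the standard Apache combined format; the regex is my assumption about what pair's raw logs look like, not anything Analog itself does:

```python
import re

# Combined Log Format prefix: host ident user [date] "request" status bytes
# This pattern is an assumption about the shape of pair's raw logs.
LOG_LINE = re.compile(r'^\S+ \S+ \S+ \[[^\]]+\] "[^"]*" \d{3} (?:\d+|-)')

def find_bad_lines(lines):
    """Return (line_number, text) for non-blank lines that don't parse."""
    bad = []
    for num, line in enumerate(lines, start=1):
        if line.strip() and not LOG_LINE.match(line):
            bad.append((num, line.rstrip()))
    return bad

good = '1.2.3.4 - - [08/Jan/2000:00:00:01 -0500] "GET / HTTP/1.0" 200 5120 "-" "Mozilla/4.0"'
print(find_bad_lines([good, 'garbage \x00 binary junk']))
```

A malformed or binary-corrupted line flagged this way would at least point at the spot in the file that makes the analyzer choke.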

Meanwhile, when I tried to run Pournelle's stats last weekend, Analog GPFd on his data. At that point, I was getting concerned enough that I emailed pair support and explained what was going on. They mailed me back yesterday evening to say that they'd run the reports there with no problem, using the raw logs that still resided in my directory on pair's server. They suggested that perhaps something had been corrupted during the download. Okay, that seemed easy enough to check, so I downloaded my 8 January raw log file, uncompressed it and ran it. Analog processed it normally. At that point, the problem seemed solved, other than trying to figure out why a file would have been corrupted during download. I re-downloaded the raw log file for 11 January, and Analog also processed it without any problems.
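The quickest way to confirm or rule out corruption in transit is to compare a checksum computed on the server against one computed on the local copy. A minimal sketch, using MD5 from Python's standard library (the file here is a throwaway stand-in for a downloaded log):

```python
import hashlib
import tempfile

def md5sum(path, chunk_size=64 * 1024):
    """Compute the MD5 digest of a file, reading it in chunks
    so even a large raw log doesn't need to fit in memory."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

# Demo: checksum a small temporary file standing in for a log.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"raw log data")
    local_copy = f.name
print(md5sum(local_copy))
```

If the two digests differ, the download mangled the file; if they match and the program still blows up, the problem is elsewhere.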

I'd already run the stats for 8 January through 14 January, but without the files for the 8th and 11th, so I decided to re-run that week's report with the new, good versions of the files for the 8th and 11th. Analog GPFd. Okay, I *knew* that the files for 1 January through 7 January were fine, because I'd already processed them and looked at the report. I still had the actual uncompressed files for 1 January through 7 January on disk, so I ran the report again. Analog GPFd. At that point, the only thing I could think of was that perhaps Analog was timing sensitive. I had it running on two different machines which exhibited the same problem, but both machines were pointed to the raw data located on a network server. Perhaps the problem was due to the stochastic nature of Ethernet and would be solved by running Analog against raw data stored on the local drive. I copied all the expanded log files over to the local machines and tried again. It blew up again.

At that point, I'd about decided that perhaps Analog 4.01, which I'd recently upgraded to, was buggy, so I reverted to Analog 3.11, which was the version I'd been using since last spring. The first couple of runs I made processed everything normally, so I'd about decided that Analog 4.01 was in fact the problem. Then I ran one last report, and Analog 3.11 GPFd. So at this point I have no idea what's going on. Analog seems to have taken a dislike to my data, my machines, or something.

I also did some experimenting last night with a product called Teleport Pro, which is an off-line web browser. IE4 and IE5 have this function built-in, but it's very limited. IE5, for example, allows you to go only three levels deep from the start page you specify, and it doesn't deal at all well with database-driven sites. I read the features list of Teleport Pro, and it appeared to be just what I was looking for. They have a crippled version available for download, so I downloaded it and gave it a try. It kind of works, in the sense that it went out and downloaded the site I pointed it at (within its limitation of 500 pages), but the downloaded local copy of the site doesn't function properly. 

I wasted a couple hours messing with Teleport Pro, and I'm about to give up on it. Does anyone know of an off-line web browser that actually works, particularly with database-driven sites? It shouldn't be all that hard to implement one. After all, all it has to do is automate the process of clicking on links and then store the resulting pages. Search engine spiders do that constantly. For dynamically-generated pages, it could simply store the returned HTML page under a generated name, and keep a map from the query string used to generate each page to the name it assigned (in case the same data was referenced from another link). For that matter, with disk space so cheap, it could download and store duplicate copies of such pages. What I want is a product that I can simply point at a web site and tell it to go get the whole site. Is there such a thing?
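That naming scheme can be sketched in a few lines. This is a hypothetical illustration of the idea, not any particular product's behavior, and the example.com URLs are placeholders:

```python
import hashlib
from urllib.parse import urlsplit

def local_name(url, mapping):
    """Map a URL (query string included) to a stable local file name,
    so a dynamically generated page reached via two different links
    is stored only once."""
    if url in mapping:
        return mapping[url]
    parts = urlsplit(url)
    # Hash the full URL so distinct query strings get distinct files,
    # while repeated references reuse the same one.
    digest = hashlib.sha1(url.encode("utf-8")).hexdigest()[:12]
    name = f"{parts.netloc}_{digest}.html"
    mapping[url] = name
    return name

pages = {}
a = local_name("http://example.com/story.cgi?id=42", pages)
b = local_name("http://example.com/story.cgi?id=42", pages)
c = local_name("http://example.com/story.cgi?id=43", pages)
print(a == b, a != c)
```

A spider built around a table like this could mirror a database-driven site without the duplicate-page confusion that trips up the simpler off-line browsers.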

 


 

 

 



Thursday, 27 January 2000



Those morons at the DVD Copy Control Association (DVD-CCA) would be funny if they weren't so evil. In the latest news, John Hoy, the President of DVD-CCA submitted to the court a declaration which includes the material that the DVD-CCA is trying to suppress, apparently not realizing that the declaration itself would become a part of the public record. So yesterday the DVD-CCA filed an emergency ex parte motion to seal the records. Duh.

More on the NT4 versus W2K speed issue in this article from The Register. It appears that it was not my imagination. W2K is slower than NT4, sometimes enough so to be noticeable. Microsoft has apparently been slanting the benchmark tests to make it seem otherwise. The fact that I noticed W2K's slowness most on one of my slower systems probably just means that the same percentage difference is much more noticeable on a slow system than on a fast one. After reading that article, the very next one I read was this one, reporting that RAMBUS is dead. After reading that, I just couldn't resist sending editor Mike Magee of The Register the following message:

-----Original Message-----
From: Robert Bruce Thompson [mailto:thompson@ttgnet.com]
Sent: Wednesday, January 26, 2000 1:41 PM
To: 'mike.magee@theregister.co.uk'
Subject: Timeliness

I read your page every day, but perhaps you should start reading mine. I declared RAMBUS dead in my daily journal for the week of 19 July 1999, and reported lower performance with W2KP build 2195 a couple of days before you guys broke the story. :)

To which he replied:

-----Original Message-----
From: Mike Magee [mailto:mike.magee@theregister.co.uk]
Sent: Thursday, January 27, 2000 5:14 AM
To: Robert Bruce Thompson
Subject: Re: Timeliness

Yeah -- but you've got to give a dying thing every chance, including artificial respiration, haven't you?

Best 
Mike

And.

-----Original Message-----
From: Mike Magee [mailto:mike.magee@theregister.co.uk]
Sent: Thursday, January 27, 2000 5:17 AM
To: Robert Bruce Thompson
Subject: Re: Timeliness

I've just checked out your pages -- it is very hard to navigate -- why don't you stick the top headlines right on the front, like we used to do at The Register before we went ballistic? 

http://194.159.40.109/news.html

And speaking of The Register, I see they're reporting this morning that the snow storm that dumped up to two feet of snow on the east coast sneaked up on us because the weather forecasters believed an IBM weather forecasting supercomputer instead of their own eyes. Reminds me of a weather forecast I heard once from a weatherman at a Vermont radio station: "Snow today. Don't know how much. Hasn't stopped."

* * * * *

-----Original Message-----
From: Alberto_Lopez@toyota.com [mailto:Alberto_Lopez@toyota.com]
Sent: Wednesday, January 26, 2000 10:42 AM
To: thompson@ttgnet.com
Subject: Installing Windows 2000 Professional on a dual boot pc with (Win98 SE and NT WS 4)

Robert,

Good Morning. Glad to hear you folks missed most of the snow... :>()

Question: I just finished building a PC around an AMD K-6 500 CPU, 128 MB of PC 100 RAM, 13.3GB HD, Creative 3-D Blaster Savage 4 Video card with 32MB of memory..

I was dual-booting Windows 98 SE and Windows NT 4 WS without a problem. All of my software was installed, my 384K DSL connection was working, my network was up and running, I mean, everything was great... Until...

I installed Windows 2000 Professional. It installed OK, detected all of the hardware correctly (except for the comm ports, no empty resources) and it seems to boot up OK, however.... It is now the ONLY OS on the HD.. It won't let me see Win NT 4 WS or Windows 98 SE... It says that the ONLY OS available to choose from in the menu under startup options is Windows 2000 Professional.

I FEAR that the MBR (master boot record) on my HD has been overwritten (again) and I would like to regain access to my dual boot of Win98 and Windows NT WS without going back to BARE METAL (again)

Did I hose up the PC (again)? Am I faced with FDISK and REFORMAT (again)? What happened? What did I do wrong?

Please Help...

Thanks,

Alberto S. Lopez
albertol@pacbell.net
Torrance, CA

If you did a standard W2KP installation (without choosing advanced options), W2K setup probably converted your NTFS4 volumes to NTFS5, which NT4 cannot read. That would account for your inability to access NT4. As far as the problems accessing Windows 98, I'm not sure. Perhaps one of my readers will have an idea, and can mail you directly. I'm afraid that you are faced with stripping down to bare metal and re-installing, though.

* * * * *

-----Original Message-----
From: J.H. Ricketson [mailto:culam@neteze.com]
Sent: Wednesday, January 26, 2000 5:28 PM
To: thompson@ttgnet.com
Subject: W2K, RAM, Performance Degradation

Bob -

Everything I have read over the past 48 hours re the fudged W2K - NT4 speed comparison implies that W2K's efficiency peaks at 32MB ram, and degrades thereafter. This is incredible! Everything I have previously believed - that you almost can't have too much RAM - that the quickest way to boost performance (all other things being equal) was to add more RAM, is now seemingly negated by W2K's apparent optimal performance at the 32MB RAM level and degradation thereafter. 32MB is low even for NT3.51. It goes against all logic.

How could this happen? Microsoft is many things, of course - but not a one of them is stupid. How could the powers that be, and the beta testers, let this pass during months of beta testing?

Really scary. I've just bought 256MB for a new box I plan to build. That should really drag down the performance of W2K.

Or am I completely misreading the fine print of the "fudged" test? Care to comment?

Regards,

JHR 
--
[J.H. Ricketson in San Pablo]
culam@neteze.com

I think you're misinterpreting the statements. It's not that W2K's efficiency peaks at 32 MB. It's that its performance relative to NT4 peaks at 32 MB. For example (using entirely imaginary numbers), at 32 MB W2KP may provide 1.00 TU/s (Thompson units per second) performance while NT4W may provide only 0.96 TU/s. At 64 MB, W2KP may provide 2.50 TU/s, and NT4W 3.00 TU/s. At 128 MB, W2KP may provide 3.00 TU/s and NT4W 4.00 TU/s, and so on. More memory helps either W2K or NT4, but it helps NT4 more. I haven't done any tests yet with W2KP, but NT4W appears to be usable in 32 MB, happy in 64 MB, and delighted in 128 MB. For most users, more than 128 MB yields diminishing returns. That's not true, though, if you run seriously memory-hungry apps (like Photoshop) or if you keep many, many windows open. More memory is always a Good Thing, unless of course you're running a Pentium system with a chipset that doesn't cache beyond 64 MB, in which case installing more than 64 MB will actually slow the system down.
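Tabulating those entirely imaginary TU/s figures makes the point plain: both systems get faster with more RAM, but the W2KP-to-NT4W ratio falls as memory grows. A quick sketch using the same made-up numbers:

```python
# Entirely imaginary figures from the text: (RAM in MB, W2KP TU/s, NT4W TU/s)
data = [(32, 1.00, 0.96), (64, 2.50, 3.00), (128, 3.00, 4.00)]

# Print W2KP performance relative to NT4W at each memory size.
for ram, w2kp, nt4w in data:
    print(f"{ram:>4} MB: W2KP/NT4W = {w2kp / nt4w:.2f}")
```

The ratio drops from above 1.0 at 32 MB to well below it at 128 MB, which is exactly the "comparable at low memory, slower at high memory" result being reported.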

 


 

 

 



Friday, 28 January 2000



I think I'm coming down with something. I've almost lost my voice. When I attempt to speak sternly to Malcolm, I squeak. I can't remember the last time I took a day off, so I think I'll do that today and just lie around and read. I do have to do the weekly backups, though. More tomorrow, or perhaps not, depending on how I'm feeling.

 


 

 

 



Saturday, 29 January 2000



Thanks for all the get-well messages. My voice still isn't working very well, but I think I'll live. Barbara took care of mom and the dogs all day yesterday, and I just lay around reading and drinking chicken-noodle soup. I polished off the latest W. E. B. Griffin and a couple of mysteries.

In between books and soup, I also ran the weekly backups. Barbara's machine, theodore, is also the main network data store. It has a Tecmar Travan NS/20 tape drive. My own main machine, kiwi, has a Tecmar 3900 DDS-3 "DAT" drive. I've gotten in the habit of doing a local-only backup on theodore (because that's where all the data is) and a full network backup from kiwi.

One reason I do it that way is that theodore runs Windows NT Server 4 and kiwi runs Windows NT Workstation. Tecmar included two versions, server and workstation, of Arcada/Seagate/Veritas BackupExec with the drives. The workstation version can back up any volume on the network that has a drive letter mapped to it, but will install only on NT Workstation (not NT Server). The Server version will install on an NT Server machine, but can back up network volumes only if they reside on machines running NT Workstation or Windows 9X (not NT Server). That means, for example, that I can't use Barbara's machine to back up data from kerby, which is my former main workstation and is currently the BDC for the network and my WinGate server. Volumes on kerby are visible, but grayed out. I guess that's why they call it "single server edition."

After I'd started the network backup on kiwi, I sat down at Barbara's machine to retension a tape. For some reason, I got to wondering if the BackupExec workstation version checked the OS version only at install-time, or did a run-time OS version check. So I copied the BackupExec Workstation folder from kiwi over to theodore and ran the executable to see what happened. The splash screen appeared normally, but the program load aborted with a missing DLL. I copied that missing DLL over from kiwi, fired up BackupExec again, and it again blew up, this time with a different missing DLL. Being stubborn, I continued finding and copying those DLLs from kiwi over to the main BackupExec directory on theodore. After copying a couple of them, I found an executable, versions.exe, in the BackupExec folder and ran it. It provided a list of the installed executables and DLLs, with question marks next to the missing ones. That made it easier to track down which DLLs were missing, and I got them all copied over. BackupExec Workstation then fired up and ran normally on NT Server.
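That by-hand DLL hunt could have been automated by simply comparing the two folders' listings. Here's a sketch; the folder paths and DLL names are made up for the demo, which uses throwaway directories standing in for the two BackupExec folders:

```python
import tempfile
from pathlib import Path

def missing_dlls(reference_dir, target_dir):
    """Return DLL names present in reference_dir but absent from
    target_dir, compared case-insensitively as NT file systems do."""
    ref = {p.name.lower() for p in Path(reference_dir).glob("*.dll")}
    tgt = {p.name.lower() for p in Path(target_dir).glob("*.dll")}
    return sorted(ref - tgt)

# Demo with temporary folders; the DLL names are invented.
ref = tempfile.mkdtemp()
tgt = tempfile.mkdtemp()
for name in ("bengine.dll", "beclass.dll"):
    (Path(ref) / name).touch()
(Path(tgt) / "bengine.dll").touch()
print(missing_dlls(ref, tgt))
```

One pass of that against the working installation's folder would have produced the same list versions.exe eventually gave me, without the copy-run-crash loop.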

Well, normally except for one thing. The drop-down list for destination showed "No devices found". After a short look around for configuration files, I concluded that that information must be kept in the registry, so I started spelunking that on both machines. Part of the problem is that these software versions apparently overlapped the change from Seagate BackupExec to Veritas BackupExec. The workstation version (on kiwi) had HKLM registry entries under Software\Seagate. The server version (on theodore) had registry entries under both Software\Seagate and Software\Veritas. The Seagate registry entry on kiwi had a Devices key, which was not present under either Seagate or Veritas on theodore. I couldn't find anything at all on theodore that was obviously tape drive configuration information.

I could simply have copied the Devices subkey from kiwi over to theodore, except that the machines use different drives. Short of removing the Tecmar Travan NS/20 tape drive from theodore, installing it in a machine running NT Workstation, installing Seagate BackupExec Workstation on that machine, and then copying the appropriate registry keys, I didn't see any obvious way to get the information I needed, so I simply punted and used the Server version on theodore to do the backup.

Also, the Tecmar 3900 DDS-3 tape drive in kiwi keeps track of how much use it's gotten since the last cleaning. I've never cleaned it, and when I started the backup it decided it was due for its first cleaning. That presented a problem, because I don't have a cleaning tape. Thinking about that, I realized that the Tecmar Travan NS/20 tape drive on theodore could probably use a cleaning too, although it's too polite to complain. That sent me off to Insight, NECx, and PC Connection in search of both DDS and Travan cleaning tapes. That was harder than it should have been.

Everyone had DDS cleaning tapes. Like DDS tape cartridges, they're cheap, about $6 to $10 or so. I had my choice of major brands like HP, Sony, Tecmar, Seagate, and so on. Any of those would be fine. But when it came to Travan cleaning tapes, there was a distinct shortage of products. Having once seen a tape drive destroyed by a rogue cleaning tape, I really prefer to clean my QIC80 drives manually with alcohol, but the Tecmar Travan NS/20 manual made a point of insisting that you should use a dry cleaning tape because using liquids might damage the drive and would void your warranty. So I definitely wanted a dry Travan cleaning tape.

One problem is that Travan cleaning tapes, like Travan data cartridges, aren't cheap. I was able to find some at $30 or so per cleaning tape, which I wouldn't have minded paying. The problem was that they were available only in five-packs, and I certainly didn't want to spend $150 for five when I needed only one. Neither Insight nor NECx was able to sell me what I wanted--one each of the DDS and Travan cleaning tapes. I was finally able to find what I needed at PC Connection, but I decided to check around locally before I ordered from them.

I used to buy regularly from PC Connection. Their prices were usually a bit higher than other reputable sources, but they delivered overnight for a flat $5 shipping charge (unless you ordered a monitor, printer, system, or other extremely heavy product). They used to promise that if you had an order in by 3:00 a.m. it would be delivered by 10:00 that day. I remember being over at our friends the Tuckers late one Friday night/Saturday morning. I ordered something from PC Connection at about 1:30 a.m. Saturday morning, which in fact turned up at our door less than eight hours later. It usually worked out that the total cost with shipping was about the same from PC Connection as from Insight or NECx, and PC Connection never caused me any problems. Neither did Insight or NECx, for that matter.

At any rate, PC Connection changed shipping policies a while back. They now charge variable shipping depending on weight. Apparently, their minimum is $10, because that's what they stated as the shipping charge to ship two tapes to me. And they no longer offer standard Saturday delivery. A $10 shipping charge on a $40 order is pretty significant, so I decided to check around locally first. I'd just as soon pay $10 more to a local place.

We're supposed to get another winter storm, so I suppose we'd better batten down the hatches. This one supposedly moves in this afternoon and continues through tomorrow night. Perhaps we'll get only an inch or two this time also, but then again we may not. It appears this one is likely to be an ice storm rather than a snow storm, so things may get a little bad around here. Ice storms tend to kill the power, and a bad one can take the power out for days at a time. Oh, well. We have food, a generator, and natural gas logs, so we won't suffer.

* * * * *

-----Original Message-----
From: See, Larry [mailto:Larry.See@am.sony.com]
Sent: Friday, January 28, 2000 8:29 PM
To: 'thompson@ttgnet.com'
Subject: RE: W2K, RAM, Performance Degradation

Dear Mr. Thompson,

In your response to JHR, your reference to chipsets that don't cache beyond 64MB gave me a shiver of concern.

I'm the guy that got the project to upgrade the department desktops. Part of that is, of course, more memory - to 128MB.

Could you give me a quick steer in the direction of more information, or do you happen to know a test for the "past 64MB cache" problem?

I would be grateful for something to plan with, should you have it.

Many thanks!

Larry See

You shouldn't have anything to worry about. What I was referring to was Socket 7 systems, not many of which are likely to be upgraded to W2K. For Pentium II/III and Celeron systems, you won't have a problem. The problem arises with Pentium systems, because many Socket 7 chipsets supported more memory than they could cache. For example, the latest Intel Socket 7 chipset, the 430TX, supports up to 256 MB of memory, but can cache only 64 MB. The immediate predecessors of the 430TX, the 430HX and 430VX, were "twin" chipsets targeted at different markets. The HX was the "high-end" chipset, with support for dual processors, and the ability to cache 256 MB. The VX was the "economy" chipset, and could cache only 64 MB. Most systems of that vintage used the VX, which was much cheaper. I have two of those sitting here: a Dell Dimension XPS M200s (200 MHz Pentium) and a Gateway Pentium/133. Both use the VX chipset, and both are effectively limited to using 64MB, although more can be physically installed.
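To put numbers on it, here's a quick Python sketch using the caching limits quoted above. These figures come from this discussion rather than from Intel's datasheets, so verify them against the chipset documentation before relying on them:

```python
# Cacheable-memory limits for Intel Socket 7 chipsets, as quoted in the
# text above (check the chipset datasheets before relying on these figures).
CACHEABLE_MB = {"430TX": 64, "430VX": 64, "430HX": 256}

def uncached_mb(chipset: str, installed_mb: int) -> int:
    """RAM installed above the chipset's cacheable limit runs uncached."""
    return max(0, installed_mb - CACHEABLE_MB[chipset])

assert uncached_mb("430VX", 128) == 64   # half of 128 MB runs uncached
assert uncached_mb("430HX", 128) == 0    # fully cached
```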

Intel was not the only one guilty of making chipsets with a 64 MB caching limit. Many competing chipsets from VIA, ALi, SiS, and so on had the same limit. And, in fairness to the makers, that wasn't really a limitation when the chipsets were designed. No one thought for a moment that anyone would ever install more than 64 MB, because memory was simply too expensive back then to make that feasible.



Sunday, 30 January 2000

[Last Week] [Monday] [Tuesday] [Wednesday] [Thursday] [Friday] [Saturday] [Sunday] [Next Week]


We have ice. There appears to be half an inch (1.25 cm) or so on the ground, but it's still coming down and is supposed to continue through tonight. Our forecast high today is 30F (-1C), so it's likely to be staying around for a while. Looks like the kids will be getting another day off from school tomorrow. I think Barbara said we're supposed to get more later in the week, too. This has been a very wintry January for Winston-Salem. I blame it on Global Warming.

* * * * *

-----Original Message-----
From: maceda@pobox.com
Sent: Saturday, January 29, 2000 5:47 PM
To: webmaster@ttgnet.com
Subject: Chipset caching ability

"The HX was the "high-end" chipset, with support for dual processors, and the ability to cache 256 MB. The VX was the "economy" chipset, and could cache only 64 MB."

Actually the HX was able to cache the full 512MB as long as you had 512KB of L2 cache. I had to face the question of upgrading TX and VX systems a couple of times and my rule of thumb was that it is better to have uncacheable memory than disk thrashing. If you use memory hungry apps like Photoshop more memory is better (even if the chipset cannot cache it).

Francisco García Maceda
maceda@pobox.com

Thanks for the correction. I believe you're right that the HX can cache up to 512 MB rather than 256 MB as I stated. That's what happens when I give an answer off the top of my head instead of checking. As for the rest, I've not tried running Photoshop or any other memory hogs on such a configuration. With the availability of cheap Celeron systems, running something like Photoshop on a Pentium makes no sense in a corporate environment anyway. 

The point I was trying to make is that Windows 9X, Windows NT Workstation, and (presumably) Windows 2000 Professional are all happy in 64 MB. Going above 64 MB provides some return, certainly, but even on a system that can cache more than 64 MB, the improvement is relatively small compared to the jump from 32 MB to 64 MB. The real problem is that Windows loads top-down in memory, and that means that if you install more than 64 MB in such a system, you'll be running Windows 100% of the time in uncached memory. My experience is that the performance degradation from doing that is more than the marginal performance improvement from having more than 64 MB. I actually installed 128 MB in such a system one time, just to see what would happen. I ran quite a few benchmarks, and they all showed that the system was slower with 128 MB. It also "felt" slower, although that may have been simply because I was expecting it to be.
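The top-down-loading point can be illustrated with a little arithmetic. Assuming, purely for illustration, a 16 MB OS footprint occupying the highest physical addresses:

```python
def os_fraction_uncached(installed_mb: int, os_size_mb: int,
                         cacheable_mb: int) -> float:
    """Windows loads top-down, so the OS occupies the top os_size_mb of RAM.
    Return the fraction of that region lying above the cacheable limit.
    (The 16 MB OS footprint used below is an illustrative assumption.)"""
    top, bottom = installed_mb, installed_mb - os_size_mb
    uncached = max(0, top - max(bottom, cacheable_mb))
    return uncached / os_size_mb

# 64 MB installed in a VX board: the OS sits at 48-64 MB, fully cached.
assert os_fraction_uncached(64, 16, 64) == 0.0
# 128 MB installed: the OS sits at 112-128 MB, entirely uncached.
assert os_fraction_uncached(128, 16, 64) == 1.0
```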

This discussion is probably moot anyway, as there is little reason to upgrade any Pentium system to more than 64 MB in a corporate setting. Given the costs involved in any corporate PC upgrade, it'd make more sense to replace the old PC with an inexpensive Celeron/400 or similar system that comes pre-configured with Windows 2000. Attempting to upgrade an elderly fleet of Pentium systems to run Windows 2000 is likely to be an exercise in frustration. Unless the systems are identical, and perhaps even then, the staff time required will be so great as to swamp the marginal cost of the drop-in replacement.

* * * * *

-----Original Message-----
From: Dave Farquhar [mailto:dfarq@swbell.net]
Sent: Sunday, January 30, 2000 12:24 AM
To: Larry.See@am.sony.com
Cc: thompson@ttgnet.com
Subject: Chipsets and memory >64MB

If you want a sure-fire way to know whether a given system will profit from more memory, head over here to pick up a copy of SiSoft Sandra. It'll tell you, among other things, the chipset the system uses.

Whether to put more than 64MB in an Intel VX or TX system is a tough call. You're crippling your CPU, but by the same token, uncached memory is still much faster than virtual memory. Most of the users I support easily outstrip 64MB by keeping Outlook, Word or Excel, WRQ's Reflection telnet client, a Web browser, and a half-dozen gimmick programs open all the time, so I usually add a pair of 64MB SIMMs or a 128MB DIMM in those types of users' systems regardless of the chipset they're using, because their drives are always grinding. Typically they'll end up with a total of 144-160 MB of system RAM afterward (I leave as much memory as possible in the box--if all their banks are full, I pull the smallest DIMM or pair of SIMMs).

One nice thing you can do with Sandra is view memory usage. So you can tell a user to load every piece of software s/he ever uses at once, then launch Sandra and see how much virtual memory is in use. Then you can determine whether you should add memory.

Personally, I'm much more concerned about how certain video and sound card drivers interact in W2K than I am about memory. I hope you're standardized, and I suggest drafting a cross-section of systems ASAP and testing W2K on them to see how it performs.

Dave
---
Dave Farquhar
dfarq@swbell.net
www.access2k1.net/users/farquhar

That's fine for Windows 9X systems, but we're talking about Windows NT Workstation and Windows 2000 Professional systems, for which SiSoft Sandra has only very limited functionality, or so the president of the company tells me. The problem, of course, is that NT and W2K isolate the underlying hardware from any application, including Sandra. All Sandra can do is report back to you what the operating system tells it, which is of very limited utility. I downloaded the free version of Sandra a couple months ago, and then contacted them with questions about the full version, W2K, and so on. The problem is that Sandra is a Windows application, and so cannot be run from a DOS boot floppy. For that reason, I regard Sandra as a user-level diagnostic program. I prefer using an industrial-strength diagnostic program like CheckIt Pro.

As far as keeping multiple programs open and expanding memory beyond 64 MB, see my response to Francisco García Maceda. I still think it's a bad idea.

 


[Last Week] [Monday] [Tuesday] [Wednesday] [Thursday] [Friday] [Saturday] [Sunday] [Next Week]

 

Copyright © 1998, 1999, 2000, 2001, 2002, 2003, 2004 by Robert Bruce Thompson. All Rights Reserved.