File: CONSPIRE.TXT     Guy Dunphy     1993 onwards...

IBM, and the crippled PC

These are some notes on the theory that the IBM PC & DOS are the result
of a conspiracy to cripple the development of personal computers. Also,
specific flaws of the PC are discussed. (A bit of screaming, too.)

This file was begun as a joke, spurred by general frustration with PCs.
Some of the things I've come across though, are hard to believe in any
light other than as the results of a conspiracy. Even the simple
explanation of gross stupidity on IBM's part grows thin in the face of
some PC 'features'. The more I study this matter, the more sinister it
all begins to seem.

However, please understand that I am not emotionally attached to this
conspiracy theory. It's more in the class of a 'fun idea', a bit like
the Tooth Fairy story, that lends a dimension of mysterious excitement
to an otherwise bloody and unpleasant experience. If someone carefully
and logically disproves the whole thing to me, I am not going to cry
and act like something precious has been taken from me. Don't be
surprised though if you still sometimes hear me saying "I believe in
Fairies!" and "Screw IBM!". A more conventional view is presented in
the book 'Accidental Empires', by Robert X Cringely, ISBN 0 14 01.7138 X.
Still, I prefer a good conspiracy theory any day.

Currently, this file is intended more as a compendium of known PC
flaws, to warn other users and serve as a 'how not to do it' reference
for designers of future computer systems. I am distributing it both by
itself, and as part of the Technology appendix of the Evergreen work.

I would be very grateful to anyone who could send me technical details
of other fundamental screwups they have found in the PC and associated
systems. Stories of run of the mill bugs, and flaws in specific clones,
are not what I want - just stuff that's really outrageous and
machiavellian. And now back to our regular paranoia...

Motives
-------
Why would IBM want to cripple personal computer development?
Three general classes of reasons suggest themselves:-

1. The desire of 'the establishment' to withhold computing power, with
   all its potential for free information dissemination, record
   keeping, etc, from the general public. 'They' would consider such
   powerful tools to be theirs alone by right. Since power in our
   society depends on information, and particularly on the suppression
   of information, powerful information handling tools must be withheld
   from the public at all cost.

2. IBM makes mainframe computers. Even back then it should have been
   apparent to IBM's forecasters that with free development, 'small'
   computers would rapidly become powerful and cheap enough to make
   mainframes of any size/speed obsolete. That's the last thing IBM
   wants, since small computers can be made by any tiny factory. IBM
   would not be able to compete, and bye bye IBM. (A good thing if you
   ask me.)

3. If large numbers of uncontrolled software developers were allowed
   to work at will, using private development systems that did not
   shackle them with numerous mental overheads, it could happen that
   the rate of software science advance would become very high. As a
   result, the sort of antique software produced by IBM, etc, for
   mainframes would become even more rapidly and obviously laughable
   than it already is.
   Furthermore, if software production became sufficiently advanced
   that small independent teams (or even individuals) were able to
   produce significant applications in short periods of time, then the
   highly lucrative software market would dry up for the large, slow
   to react companies like IBM. Heavens, you might even see major
   utilities appearing in the public domain!

Methods
-------
So, (IBM thinks) given the general aim of slowing down personal
computer development as much as possible (or even halting it), how to
go about it? We can assume that the people in charge are not fools
(extremely clever, actually, to judge by the subtlety of their
actions). Just plain banning personal computers is not an option (at
least not yet, though see notes elsewhere on hackers, bulletin boards,
neo-luddites, etc. Also, Bruce Sterling's book The Hacker Crackdown is
very interesting.)

They were aware that any standard architecture that evolved could
become widely available due to cheap mass production by 'clone'
makers. Now, IBM has always had the ability to produce hardware that
contained elements resistant to clone production. Those cute
rectangular aluminium encased PGAs with the upside down mounted chips
that IBM makes, for instance. Not to mention their ability to make
PCBs with about 5000 layers of microscopic embedded tracks and more
solder than fibreglass. Also, IBM has the legal muscle to squash clone
producers if they wanted to, and you'd think they would want to, if
they were just in the business of selling computers.

They also recognised that it was no use just producing a crummy piece
of hardware, trying to make everybody use it, and preventing anyone
else from copying it. All that would happen would be that another
(better) standard would be developed by someone else. This would then
be adopted by the cheap clone makers, and IBM would have no control
over further events.

Required attributes
-------------------
From this reasoning comes the conclusion that what was needed was a
product with these attributes:-

- Should appear to be pretty good, to all but the most expert and
  careful examination, preferably in hindsight only.

- It would be cloneable, while allowing IBM to pretend that they were
  dead against the idea. Hence it should not contain any technology
  obviously proprietary to IBM, or if it did, such elements should be
  easily substituted with independently developed stuff. (I'm talking
  about the BIOS). The general aim is for IBM to be able to look like
  they're doing their best to stop cloning, while being sure of losing.

- The cloned machines should be cheap enough to hopefully take over
  the entire market, hence blocking the development of other
  architectures.

- Fundamental features of the architecture should present impassable
  obstacles to future evolution along certain 'dangerous' paths.

- The architecture should contain as much excessive complexity as
  possible. The aim here is to waste as many man-hours of as many
  independent software developers as possible. In general, time spent
  screwing around with obscure 'features' such as segmented registers,
  stupid memory maps, byzantine instruction sets, etc, is time not
  spent developing dangerous applications like simple networking
  software.

- A further result of the 'excessive complexity' is that it presents a
  barrier to useful understanding of the system by novice programmers.
  Another thing that gets up the establishment's nose is self taught
  computer experts (you never know what they might get up to).
  So you make the computers so bloody complicated that it takes years
  to get good enough to really do anything fundamentally new. For
  those that do manage to become competent, create a derogatory term
  ("hacker") with which to label them, then start work on brainwashing
  the public into equating "hacker" with "criminal". Every smart
  person turned off the path to computer wizardry is another threat
  removed, so far as the establishment is concerned.

- A further method of enhancing the 'time wasters' is to not make
  fundamental technical documentation available. eg The bus timing
  specs.

What we got in the end
----------------------
Hitch-hiker's Guide to the Galaxy:
  "Earth: Planet responsible for creating the IBM PC.
   Otherwise harmless."

There is a massive industry wholly dependent on the existence of the
IBM PC, and the publishing sections of that industry seem to do
nothing but sing endless praises to the glorious, powerful, fast, etc,
etc, machines we use. I wonder if all those people really believe the
machines are so great? Maybe they just can't imagine anything better?
Anyway, here are a few of the more blatant flaws I have come across in
my battles with this stinking heap of shit some dare to call a
computer.

Flaws in the PC hardware
------------------------
* The floppy disk select lines screw-up.
  It is impossible to imagine that this is accidental. The only
  possible scenario that could have resulted in this arrangement
  involves an intent to prevent users from connecting more than two
  floppies (and to make even two as difficult as possible).

  Briefly, the standard floppy drive connector has four 'drive select'
  lines, so allowing four drives to be connected to one cable and
  controller (in a real computer). This was so long before the PC
  arrived, and still is today. IBM, though, (bless them with boiling
  oil) decided that four drives was far too many, two would be
  reasonable, and just one would be even better. So they designed the
  PC floppy controller card with DIFFERENT signals on the floppy
  cable! Here are the signals, per IBM vs everyone else.
  Note: all signals are active low (same as IBM's ethics)..

  CABLE PINS        IBM PC AT (STUPID)    STANDARD FLOPPY
  GND SIG  I/O      CONTROLLER SIGNALS    DRIVE SIGNALS
  ---------------------------------------------------------
   1   2    I       RWC, LD               Reduced write
   3   4    -       Not used              Reserved
   5   6    I       Not used              Drive sel 3    !@#%
   7   8    O       Index                 Index
   9  10*   I       Motor en A            Drive sel 0    !@#%
  11* 12*   I       Drive sel B           Drive sel 1    !@#%  <---- Drive B:
  13* 14*   I       Drive sel A           Drive sel 2    !@#%
  15* 16*   I       Motor en B            Motor On       !@#%
  17  18    I       Direction             Direction
  19  20    I       Step pulse            Step pulse
  21  22    I       Write data            Write data
  23  24    I       Write enable          Write enable
  25  26    O       Track 0               Track 0
  27  28    O       Write protect         Write protect
  29  30    O       Read data             Read data
  31  32    I       Head select           Side 1 select
  33  34    O       Disk changed          Disk changed

  Most PC floppy cables have a group of seven wires (marked '*' above)
  flipped between the two floppy connectors. This is a kludge needed
  to even get two drives to work. Its effect is (for the 2nd, ie
  further drive):-

  CONTROLLER SIGNALS    NOW ON PIN    TO DRIVE INPUT
  16  Motor en B            10        Drive sel 0
  15  GND                   11        GND
  14  Drive sel A           12        Drive sel 1    <---- Drive A:
  13  GND                   13        GND
  12  Drive sel B           14        Drive sel 2
  11  GND                   15        GND
  10  Motor en A            16        Motor On

  By the way, the PC software seems to drive the 'Motor en' and 'Drive
  sel' lines identically, which makes IBM's hardware even more
  pointless. So, with the twisted cable, you jumper both drives as
  "Drive 1", and the one closest to the controller becomes "B:", with
  the further one as "A:".
  Without the twisted cable, you need to cut the track to pin 16 on
  the controller, jumper the drives as 0 and 1, and make sure both
  drives are jumpered to take their 'motor en' from the drive select,
  not pin 16. Given that floppy drives for PCs are NEVER supplied with
  ANY documentation, that may be difficult.

  Another interesting point is that the BIOS Int 13h floppy disk
  functions all accept a drive number parameter of 0 to 3. Could it be
  that the system was originally designed to access the standard four
  drives?

* On the subject of floppy disks: 5.25" disks are write protected when
  the slot is covered. So whose bright idea was it to make 3.5" disks
  write protected when the hole is _open_? Trivial but stupid changes
  like this really irritate me.

* The weird clock and interrupt rates. 18.2 ticks per second (about
  54.9 mSec per tick)? Arrgh! In fact it's not even exactly 18.2 per
  second. The non-integer tick rate is a result of the strange crystal
  frequency chosen (14.31818 MHz, divided by 12 to give a 1.19318 MHz
  timer clock), which does not conveniently divide down to anything
  reasonable like accurate 10 or 20 mSec periods. This makes
  generating commonsense time intervals like n seconds a real pain.
  (A sketch of reprogramming the timer follows below.)
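  Since we're here, this is how you take the tick rate into your own
  hands. A minimal sketch, assuming Borland-style port I/O from
  <dos.h>; the divisor arithmetic is the point. The timer's input
  clock is 14.31818 MHz / 12 = 1193182 Hz, and the BIOS leaves channel
  0 at the maximum divisor of 65536, hence 1193182 / 65536 = 18.2065
  ticks per second:

    /* Reprogram 8253/8254 channel 0 for a sane tick rate. */
    #include <dos.h>

    #define PIT_CTRL 0x43
    #define PIT_CH0  0x40

    void set_tick_hz(unsigned hz)
    {
        unsigned divisor = (unsigned)(1193182L / hz); /* 11931 -> 100 Hz */
        outportb(PIT_CTRL, 0x36);           /* ch 0, lo/hi bytes, mode 3 */
        outportb(PIT_CH0, divisor & 0xFF);         /* low byte first */
        outportb(PIT_CH0, (divisor >> 8) & 0xFF);  /* then high byte */
    }

  Of course, if you do this you must also hook Int 8 and call the old
  handler at the original 18.2 Hz rate, or the DOS time-of-day count
  drifts - yet another little tax on everyone's time.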
* Speaking of clocks reminds me of the Real Time Clock, and that
  reminds me of the pathetic few tens of bytes of battery backed CMOS
  memory this has. Even way back then, 2K byte CMOS RAM chips were
  available. Including one of these, with proper power down
  protection, etc, would have made a big difference in the PC's
  general usefulness.

  In general, to be useful for any sort of real world control, a
  computer needs at least these features:-

  - Watchdog reset, to restart the machine automatically after a
    crash.

  - Power fail warning interrupt, at least 10 mSec before power
    actually goes. A system hardware reset should also activate just
    before power does go. This is easy to do with switching power
    supplies.

  - Solid protection on the CMOS RAM, to preserve data during power
    downs, wild processor operation, etc.

  - A 'real-world status register' giving info like:-
    - Cause(s) of impending reset, and (optionally) time remaining
      till reset.
    - Whether the last reset was due to manual, watchdog fault or
      power fail.
    - Whether the last power-down was due to power loss or manual
      shutdown.
    - Whether the last power-up was due to manual, RTC, or external
      request.

  - An RTC with a software settable time & date 'alarm' output, which
    may be used to cause the machine to power back up if main supply
    power is OK. There should also be an external 'request power-up'
    input, as well as the ability to set peripherals such as
    keyboards, mice, etc in a 'halted, request power up on user
    action' mode.

  - A maskable interrupt derived from the zero crossings of the mains
    supply, if that is used. This can be used for numerous purposes,
    such as:-
    - Determining the processor clock speed, and/or the international
      50/60 Hz mains type used.
    - Doing drift calibration/adjustment of the internal clock chip.
    - Allowing phase control in software of mains loads.

  Naturally, the original PC had none of these features, nor have they
  emerged as standards since. So you still can't do any practical and
  useful physical process control with standard PCs. (Not safely,
  anyway.)

* The ridiculously over complex and vague bus timing/protocol. Have
  you ever looked at a PC bus with a logic analyser? It SUCKS! If I
  wanted to be kind to IBM, I'd say the reason they never released a
  formal specification for the interface bus is that they're too
  embarrassed. More likely, though, they just wanted to add to the
  'time wasting' class of developer crippling features.

* The processor family. Segmented addressing, and the amazingly
  complicated 'protection' stuff in the later chips (have you ever
  tried to read the Intel data books on the 386, 486, etc?). Actually,
  I think they mean "protection against being used by almost anyone".

  80x86 CPUs are 'little-endian', ie they store the least significant
  byte of words in the lower address. This is a crime against nature,
  and worse, it makes looking at data in memory with debuggers, etc, a
  pain.

  Also: the 80x86 family instruction set is not well suited to
  creating code that is position independent 'as is'. The result of
  this is the ridiculous business of .EXE files, and the requirement
  for DOS to patch all those absolute refs each time an EXE file is
  loaded. How many generations of programmers are going to have their
  heads screwed up with this crap?

* The insane I/O memory map. What did they do, draw addresses out of a
  hat? The first thing wrong is that the CPU has separate I/O and
  memory spaces at all (an Intel feature). Then the I/O address map is
  the most unbelievable mess you ever saw. On top of that is the
  scattering of great chunks of fixed purpose address space into the
  memory space between 640K and 1 Meg, together with the various
  schemes to salvage sense from this by 'shadowing' bits of that space
  with the overlaid RAM space. Finally there's the awful mix-ups to do
  with Expanded and Extended memory beyond the 1 Meg barrier. Don't
  you love those names! So obvious that one refers to memory added by
  shuffling extra pages into gaps in the top of the first 1 Meg, while
  the other simply raises the upper address range. Of course I don't
  need to mention which is which, or which one is also called EMS.

  Then we have the little matter of the many 'standard' methods that
  have arisen to control how the extra memory is used by applications.
  How long would it take you to find a good explanation of what DPMI
  or VCPI mean for you, and how to program them?

* Non Open Collector driven interrupt lines on the peripheral bus. In
  sane computers, interrupt lines have pullup resistors, and
  peripheral cards drive them with open collector outputs. This allows
  several cards to be 'wire or-ed' on a single interrupt line. The
  interrupt service routines are then daisy chained, with each
  checking its own hardware (as in the sketch below). This works fine.
  However, PC peripheral cards drive the interrupt lines with normal
  TTL outputs, so only one card can use each interrupt line on the
  bus. This is so dumb that it is hard to believe even an incompetent,
  novice hardware designer could have come up with the idea. Sounds
  more like the result of a 'management directive' to me.
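  For reference, the software half of interrupt sharing is trivial -
  which is what makes the TTL-driven lines so galling. A hedged
  sketch, assuming a Borland-style compiler with getvect(), setvect()
  and _chain_intr() in <dos.h>; the status port 300h and its bit 7 are
  made up for the example:

    #include <dos.h>

    #define IRQ5_VEC  0x0D      /* IRQ5 arrives as INT 0Dh on the AT */
    #define MY_STATUS 0x300     /* hypothetical card status port */

    void interrupt (*old_isr)();

    void interrupt shared_isr()
    {
        if (inportb(MY_STATUS) & 0x80) {  /* did OUR card raise it? */
            /* ... service our hardware here ... */
            outportb(0x20, 0x20);         /* EOI to the 8259 PIC */
        } else {
            _chain_intr(old_isr);         /* not ours: next in chain */
        }
    }

    void install(void)
    {
        old_isr = getvect(IRQ5_VEC);
        setvect(IRQ5_VEC, shared_isr);
    }

  Every card's driver does the same, each chaining to the previous
  handler. With open collector lines this just works; on the PC bus it
  can't, because two cards driving one line fight electrically.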
* Another line with the wrong polarity is the RESET line (pin B2) on
  the peripheral bus. This is active high too! Why?? It must be the
  only active high reset line in the entire universe! It is driven
  onto the bus by a standard 74LS00 gate, which means that cards on
  the bus cannot initiate a hardware reset, since to externally force
  a standard gate's output high risks destroying the gate. Even if the
  line _could_ be pulled high, it would do no good, since the bus
  RESET line is not the true origin of all motherboard reset
  circuitry.

  The right way to do it is to have an active low bus reset line,
  pulled high by a single resistor, and pulled low by open collector
  drives from whatever sources of reset action are required (both on
  the motherboard and on add-on cards).

  In general, the PC's hardware reset circuit is so weird, so
  different from standard electronics practice, that one must either
  assume gross, mindboggling incompetence, or some strange ulterior
  motive on the part of its designers.

* Ridiculous circuit design. Ever seen the circuit for the parallel
  ports? Apart from general stupid excess complexity and lack of
  regularity, there is one particularly infuriating, over the top
  omission. The 8 data pins on the plug are driven by a latched output
  reg (a 74LS374), and readable via an input reg (a 74LS244), BUT! the
  output reg has its ~OE (output enable, active low) pin grounded.
  Thus this chip always drives the pins, preventing the port being
  used for input.

  Now the odd thing is that the latch for the printer control signals
  (a 74LS174, yet) has its highest Q output (latched BD5) unconnected.
  Strange? Looks like someone originally designed it right, with the
  174 Q5 driving the 374 ~OE, giving the ability to tri-state the port
  driver. Then I guess the bad guys got to the circuit, and no more
  I/O PIO. If you look at the original schematic as published by IBM,
  the change practically leaps out at you. Well drawn schematics have
  a kind of 'artistic balance', with evenly spaced detail over the
  area of the page. This schematic is like that - except for one
  unbalanced blank area. Which just happens to be where a line between
  the 174 Q5 pin and the 374 ~OE pins would go. Put it back in, and
  the page looks perfectly balanced. I can just see the scene:
    Manager:  "No, we don't want the port to be bi-directional."
    Engineer: "OK boss. (rub rub rub)"

* Serial ports. What! Only four at most? Also, notice that the
  standard serial chip used has 40 pins. That's for just ONE port.
  There are perfectly good quad serial port chips that size. So why
  not use them? You'd be able to fit too many serial ports on one
  card, that's why. Other amazing things:-

  - There are only 2 bus interrupt lines allocated for serial use.

  - DOS/BIOS functions regularly disable interrupts for long enough to
    lose serial characters above about 2400 baud. This is why the only
    cards that really work at high speeds are those with long FIFOs,
    either in the serial chips, or separately implemented.

  - The PC uses MALE DB25s for its serial ports. They are supposed to
    be FEMALE on a DTE (DTE means computer, in this case).

  - The BIOS does not properly support even all 4 ports.

  - The memory map mess is particularly nasty for the serial ports.
    Not only are serial port I/O addresses out of order and
    non-contiguous, but they don't even have fixed logical name
    mapping. Whether a certain port is called COM2 or COM3, for
    instance, depends on what other ports are there!

  Is there ANYTHING about the serial ports those ****'s didn't manage
  to screw up? Not really. Now is that surprising, considering that
  the ability of a computer to connect to the outside world is vital?
  Just the sort of thing that you'd want to give special attention to
  if you were trying to limit the usefulness of a machine
  architecture.
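  You can see the COM-name silliness for yourself. The only 'fixed'
  thing about the serial ports is the table the power-on self test
  fills in at the bottom of memory: four words at 0040:0000 holding
  the I/O base of whatever it decided COM1..COM4 are (0 if absent). A
  minimal sketch, assuming Borland-style peek() from <dos.h>:

    #include <dos.h>
    #include <stdio.h>

    int main(void)
    {
        int i;
        for (i = 0; i < 4; i++) {
            unsigned base = peek(0x40, i * 2);  /* BIOS Data Area */
            if (base)
                printf("COM%d at I/O %03Xh\n", i + 1, base);
            else
                printf("COM%d absent\n", i + 1);
        }
        return 0;
    }

  Pull out a card and watch the remaining ports get renamed under you.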
* The lack of a correctly functioning 'I/O ready' line on the bus.
  Hence the hassles with cached systems that leave insufficient time
  between I/O cycles, causing random crashes, etc. The only way to
  solve this is to bracket all I/O ops with two long jumps, to flush
  the prefetch queue and force a memory cycle to occur between the I/O
  ops. Disassemble the BIOS of any fast cached 386 or up, if you don't
  believe me. This great feature makes it impossible for people not in
  the know to write low level code that works reliably on all
  machines. Also, some machines seem to cache I/O operations. Great!
  Later chipsets fixed this problem, thank goodness, and the shift to
  PCI bus will make it entirely moot.

* DMA screwup. Wraps around in 64K byte pages. Arbitration is stupid -
  one 'chip B' cycle can occur for each cycle of chip A. DMA not
  implemented for hard disk: no Dreq/Dack from the disk controller.
  This is really pathetic; it means the processor gets tied up doing
  disk accesses. This would have to count as one of the most serious
  omissions in the PC - that the major data storage device has no
  means for automating bulk transfers of data to and from memory.

* Speaker interface. OC transistor, not push pull. This circuit is a
  direct copy of the Apple II speaker output. Because the circuit is
  open collector, driving the inductance of the speaker, you can't
  even do reasonable pulse width modulation, because the speaker just
  rings at its resonant frequency. Also, having the logic +5V supply
  as the speaker supply is a boon (boom) if you want to connect an
  external speaker. Be REAL careful of shorts! Contrast this with the
  Mac's DtoA with dedicated DMA channel! The Mac came out around about
  the same time, remember.

  One amusing result of this design, for those of you with
  SoundBlaster cards who have connected the motherboard 'Spkr out' to
  the SB 'Spkr in': if you use the SoundBlaster mixer utility to turn
  the 'PC spkr in' source right up, and also crank up the master
  output volume, you can hear the processor noise on the 5V power
  rail. This is because with no actual spkr output, the 5V rail's AC
  noise is effectively being input to the SoundBlaster. When running
  some busy programs (eg Windows) you actually get some neat sounds.

* Has anyone ever seen a dimensions drawing from IBM for the
  peripheral cards? All the ones I've seen have been 'derived', ie
  measured from existing cards. I don't think IBM ever published one.

* The address lines on the ISA extension sockets. The original XT bus
  socket (A & B) has address signals SA[0..19], which allows access to
  memory 0..FFFFF (1 Megabyte). The AT added the short socket (C & D)
  with address signals LA[17..23].

  a. Stuffed if I know why they duplicated A17..19. The signals have
     different timing, but are otherwise the same old address bits
     from the CPU. The IBM tech ref makes no mention of any reason for
     this.

  b. LA20. Bloody Hell! Get this: the 8042 keyboard controller has a
     port output bit (P21) used as a "force A20 low" line. From the
     Compaq 386s/20 Tech Ref manual, pg 5-2: "Driving A20 low allows
     for compatibility with certain application programs written for
     the 8088/8086 that expect automatic segment wraparound at the end
     of the 1-megabyte address space." So we're stuck with this
     forever, eh? (See the sketch below for the resulting ritual.)

  c. LA[21..23]. A whole 3 bits of extra address! Oh thank you, IBM.

  d. The 80386 A[24..31] are not connected to the expansion bus
     sockets. A31 is used only on the motherboard for PAL decoding,
     etc.
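  The A20 ritual, for the record: to use even the 64K of memory just
  above 1 Meg (the HMA), you have to ask the *keyboard* controller to
  please stop forcing address line 20 low. A minimal sketch, assuming
  Borland-style port I/O; this is the classic 8042 method, though some
  machines need their own chipset-specific variants:

    #include <dos.h>

    static void kbc_wait(void)   /* wait for 8042 input buffer empty */
    {
        while (inportb(0x64) & 0x02)
            ;
    }

    void enable_a20(void)
    {
        kbc_wait();
        outportb(0x64, 0xD1);    /* 8042 "write output port" command */
        kbc_wait();
        outportb(0x60, 0xDF);    /* output port value with A20 high */
        kbc_wait();
    }

  A memory addressing function, routed through the keyboard
  controller. I rest my case.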
* The IDE hard disk interface. Now here is an example of a more recent
  stuff-up. I don't know (or care) who developed this standard, but it
  sucks just as much as anything in the original PC design. For
  instance:-

  - The ribbon cable to the drive brings some unbuffered motherboard
    bus lines out on the cable to the drive! This is guaranteed to
    result in noise, reflections, capacitive loading, etc, on those
    bus signals.

  - Early drives had a bug that prevented use of two drives. Not
    surprising, considering that the drive implements a set of control
    registers in the PC's address space - so with two drives there are
    two duplicate sets of registers at the same address. Supposedly
    the bug has been fixed, but I hear it's still a pain to get two
    drives going together.

  - It's another 'two drives only' waste of time. OK, so motherboards
    now have 'primary' and 'secondary' IDE interfaces, and so four
    drives are allowed (three, plus a CDROM). Meanwhile SCSI, which
    allows up to seven drives plus controller, remains expensive
    because it's 'less popular'.

  This is a fine example of how an originally poisoned standard (IBM's
  ST506 drive interface, which should have allowed 4 drives but only
  did two) can poison all subsequent derivatives, unless someone has
  the guts to do a clean redesign. In this case, no one did.

* The joystick interface. Again, this is a clear rip-off of an Apple
  II circuit. It uses a quad pulse generator chip, the 553. The
  joystick pots are connected in 2-wire mode (one end and wiper)
  between +5V and a cap to ground. When the 553 is triggered by
  software, it discharges all four caps, starting an output pulse for
  each one. The caps charge up via the joystick pots, and as each one
  gets to a threshold voltage, the output pulse for it is terminated.
  The four output bits are software readable via port 201h. Software
  polls these continuously, and counts the time each bit takes to go
  low again after a trigger. (A sketch of this polling loop follows
  below.)

  Needless to say, this is the crappiest method imaginable for getting
  analogue input into a computer. You couldn't do worse if you tried.
  Its defects are:-

  - Non linear. The joystick angle vs pulse width proportionality is
    lousy.

  - Prone to glitches due to poor wiper contact on the pot, since it
    is really a current sensing input.

  - Subject to component tolerances and drift. The signal range
    depends on just about every characteristic of every component. For
    example, most joystick interfaces use cheap ceramic disk caps for
    the timing caps. These have about a +/- 30% tolerance to begin
    with, and also have a high thermal drift coefficient. That's why
    your joystick centre seems to drift as the computer warms up.

  - Uses up a ridiculous amount of processor time. It has to sit in a
    loop, just reading that port, every time it wants to get a
    joystick position.

  - Cannot be used to input external analogue signals. There is no way
    you can feed, say, an audio signal into the joystick port and
    sample it.

  - The joystick connector must bring out the main +5V supply. This is
    very silly, since any shorts will crash the computer at a minimum,
    and destroy it at worst.

  Now all that is simply astounding. The right way to do it would be
  to use any cheap, stand-alone analogue to digital converter chip.
  Either one with a 2 or 4 way analogue MUX on chip, or use an
  external MUX. With typical conversion times of less than a hundred
  microseconds, the matter of non-simultaneity of the x,y samples is
  not a problem. What's more, the input could be voltage sensing, ie
  very low current. This would allow both the input of external
  signals, and the connection of joystick pots in three wire mode,
  which is much less subject to problems of wiper resistance noise.
  Why this was not done is beyond me, except as an example of the
  conspiracy idea. After all, you don't want people being able to get
  real world sample data into their computers, do you?
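  Here is the polling loop in all its glory - a hedged sketch,
  assuming Borland-style port I/O; the timeout constant is arbitrary.
  Note that the results are in loop iterations, so they depend on the
  CPU speed as well as on the pots, caps and temperature. Marvellous:

    #include <dos.h>

    #define JOY_PORT 0x201

    /* Fills count[0..3] with pulse lengths for axes X1,Y1,X2,Y2. */
    void read_joystick(unsigned count[4])
    {
        unsigned i, t;
        unsigned char bits;

        for (i = 0; i < 4; i++)
            count[i] = 0;
        outportb(JOY_PORT, 0);            /* any write fires the 553 */
        for (t = 0; t < 0xFFF0; t++) {
            bits = inportb(JOY_PORT) & 0x0F;  /* low 4 bits = axes */
            if (bits == 0)
                return;                   /* all pulses have ended */
            for (i = 0; i < 4; i++)
                if (bits & (1 << i))
                    count[i]++;           /* still high: keep counting */
        }
    }

  Compare that with one 'start conversion' command and an interrupt
  from any real ADC chip.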
* The graphics (lack of) standards. Since the original PC was brought
  out with pathetic graphics capabilities, there has been a continual
  torrent of different graphics cards, and methods of controlling
  them. It's not so much the particular designs that are part of the
  crime here, as the continued non-appearance of one very good and
  widely used standard graphics format. As a result there are
  hundreds, if not thousands, of different graphics cards for PCs, and
  each one of them needs its own custom video driver software to work
  best.

  This has turned out to be one of the most successful 'time wasters'
  of the whole PC architecture. Not only must software developers
  spend ages writing drivers for many different cards, and checking
  compatibility, but users suffer from needing large amounts of
  storage for all the potential drivers, stuffing around installing
  the right ones, and the obsolescence of old software as graphics
  hardware is upgraded and becomes incompatible with the old code.

  A good machine _must_ have a standard, high quality, fast graphics
  system, with at most 3 or 4 display modes. Preferably only one mode:
  a high res (800x600 or better) pixel mapped, flat memory image, with
  at least 16 and preferably 24 bits per pixel.

The 80x86 Processors
--------------------
There are several things seriously wrong with the CPUs that PCs use.
The worst of them are:-
- Non-relocatable code.
- Segmented address registers/memory structure.
- Word/paragraph alignment.

Less serious (ie just a plain pain) are:-
- Little Endian byte order.
- Complex instruction set and non-regular register functionality.

For a longer discussion, see TECHNOL\APPENDIX\CPU_86.DOC

Flaws in MS DOS (Mess-DOS)
--------------------------
* It's not multi-tasking. Now listen here! Multi-tasking systems are
  EASY to implement. All it takes is a reasonably well thought out
  basic structure, and hardware with some halfway reasonable memory
  protection system. Yea.. well I guess that's two reasons why the PC
  is mono-tasking. DOS and the BIOS are not re-entrant. Groan.

* The CPU/DOS interrupts clash. A CPU 'Bound' exception int causes a
  'print screen' call?! Astounding. I refuse to believe this one just
  happened. There is a whole block of these clashes: CPU ints $05 to
  $1F. See The MS-DOS Encyclopedia, 1988, page 411, 2nd paragraph.

* .EXE files: non checking of their checksums. The only possible
  reason for this is to make sure that viruses are possible. Thanks
  IBM! This text copied from a Calmer Utilities (NBY) doc file:-

  | And now for the sad part: (!)
  | Computer viruses attacking .exe files should never have been
  | possible. You see, every .exe file carries a checksum with it
  | which gets calculated by the linker at time of linking the
  | program. The idea was that DOS, when loading an exe file would
  | first check the values against the check-sum and refuse to run if
  | there was a discrepancy. Alas, from DOS 1.00 to current versions,
  | DOS has never bothered to check, at least none of the versions of
  | DOS that have ever been released to the public. Naturally, since
  | DOS ignores it, some third-party linkers ignore the check-sum too,
  | which makes it just about impossible to implement the feature now.
  | Thus, any anti-virus software checking the check-sum, which would
  | be as easy as ABC, has no chance of detecting illegal changes to
  | programs.....
  | And so we battle on.
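  "As easy as ABC" is right. Per the MS-DOS Encyclopedia, the checksum
  word at offset 12h in the .EXE header is chosen so that the 16-bit
  sum of every word in the file comes to FFFFh. A minimal sketch of
  the check DOS never does (odd-length files padded with a zero byte;
  whether a given linker bothered to set the field is another matter):

    #include <stdio.h>

    /* Returns 1 if checksum is good, 0 if bad, -1 on open failure. */
    int exe_checksum_ok(const char *path)
    {
        FILE *f = fopen(path, "rb");
        unsigned sum = 0;
        int lo, hi;

        if (f == NULL)
            return -1;
        while ((lo = getc(f)) != EOF) {
            if ((hi = getc(f)) == EOF)
                hi = 0;                  /* pad odd-length file */
            sum = (sum + ((unsigned)lo | ((unsigned)hi << 8))) & 0xFFFF;
        }
        fclose(f);
        return sum == 0xFFFF;
    }

  A dozen lines, and it was never put in the loader.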
* A similar point may be made regarding the extremely dangerous
  ability of ANSI.SYS to assign strings to keys, which makes it
  possible for someone's cute ANSI formatted screen file to reprogram
  your keyboard so the 'enter' key does a "format c:, Y"! What a
  useful feature! Can anyone tell me why there is no command line
  option for ANSI.SYS to disable key string loading?

* The fragile nature of the disk structure. File linking has no
  redundant info, so if something goes wrong, you lose it all. Anyone
  with half a brain would have used file blocks with linked pointers,
  so files could be recovered if root info was lost. So DOS keeps two
  FAT tables - so what? It doesn't use the 2nd copy, or even check
  that they're the same. It wouldn't have been so hard to implement
  variable sized blocks with background garbage collection, so small
  files didn't waste so much disk space, either.

* The 'non-swapability' of hard disks. DOS partitions are labelled
  with their assigned logical drive number, which is bloody stupid. It
  means you can't swap drive selects on one PC, or move a drive from
  one PC to another, unless 'lower' drives on both machines have the
  same partitions. As for the 'primary vs extended' partition madness,
  and the BIOS's insistence that all 'primary' partitions must be
  first in the drive ordering... urgh! The result is the musical
  chairs of drive letters you get any time you add a new hard drive to
  your system - and the consequent chaos with all the stored path
  names that are now wrong.

  The right way to do this is for each drive to have a user-assigned
  _name_, and for these to be used in all path references. Who cares
  what the physical drive location/decoding is? BIOS setup would then
  allow setting CMOS RAM with a 'boot from drive name:_____' entry,
  and all the problems due to imposing some pointless 'drive ordering'
  would vanish.

* The environment. When DOS executes a program, DOS creates a
  duplicate copy of the data in the 'master' DOS environment, and puts
  the data in a new environment owned by the program. There are two
  things wrong with this. Firstly, the copy environment has no free
  space, so the program can't create environment variables unless it
  mallocs a bigger new environment copy. Secondly, and most
  importantly, it is not possible to pass data between different
  programs using the environment.

  This is really, really stupid. The only argument I have ever heard
  in support of passing programs a copy of the environment, instead of
  just giving them access to the master environment, is that this
  "prevents programs from corrupting the master environment". Hah! So
  why not prevent them from accessing the hard disk, cos they might
  corrupt that too! A large global environment space, accessible (and
  writable) by all processes, is a crucial feature of a 'good'
  computer. Sure, I know you can achieve the same effect using disk
  files, but that's _clunky_!

  At least this screwup is something you can get around. I wrote some
  C functions for locating and editing the master environment, and
  these make a big difference to the general usefulness of code I can
  write for the PC.
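  For anyone who wants to do the same, the core of the trick is well
  known (see the 'Undocumented DOS' book cited later on). A minimal
  sketch, assuming Borland-style <dos.h> (the _psp global and peek()):
  follow the parent-PSP chain until a PSP is its own parent - that is
  the root command shell - then read its environment segment. Some DOS
  versions need more cunning than this, so treat it as the simple case
  only:

    #include <dos.h>

    unsigned master_env_segment(void)
    {
        unsigned psp = _psp;        /* our own PSP segment */
        unsigned parent;

        for (;;) {
            parent = peek(psp, 0x16);   /* parent PSP field */
            if (parent == psp || parent == 0)
                break;                  /* root shell found */
            psp = parent;
        }
        return peek(psp, 0x2C);     /* environment seg at PSP:2Ch */
    }

  Once you have the segment you can scan and patch the NAME=value
  strings in place - with due care, since DOS certainly won't check
  your work.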
* Things like DOS help, file viewers, etc, where [ESC] does NOT exit
  the program instantly. DOS Help needs [ALT], [F], [X]. Incidentally,
  it's also amazing that the DOS Help utility must have the otherwise
  useless QBASIC.EXE program present. More wasted disk space!:-

    QBASIC   EXE   194,309  30-09-93   6:20a
    QBASIC   HLP   130,881  30-09-93   6:20a

* Eight char file names, with three char extensions. Pathetic.
  The story behind this (thanks Edgar) is as follows:------

  Long ago on PDP-11 computers, someone had the idea of compressing
  text by squeezing 3 characters from a reduced set into a 16 bit
  word. The cube root of 65536 rounds down to 40, so a 40-character
  set was chosen - upper case letters, digits, space, and 3 more. The
  resulting system was called radix-50, because 40 decimal is 50 in
  octal. Reading dumps became very difficult. Only the most extreme
  hacker ever learned to decode radix-50 in their heads. To take
  advantage of the squeezing, system strings were all multiples of 3
  characters. Filenames were nine chars long, plus a 3 char extension.
  When CP/M was designed, it was heavily based on DEC operating
  systems. Fortunately the radix-50 stuff was dropped. The 9 char
  filename became 8, but the 3 char extension remained.

  -----Isn't that a great horror story? Gives me the willies!

* The miserable batch command language. See the general comment on OS
  scripts later on. This applies to both DOS and WhimDoze. For
  example:-

  - 'Internal' DOS commands do not return an errorlevel code if they
    fail. The MKDIR/MD command just prints an error message to STDOUT
    and aborts if it fails to create the requested new directory. This
    makes using it in batch commands problematical.

  - 'IF EXIST' cannot detect directories, and you cannot even use
    = *.* to detect if _any_ files exist. (Actually, I have since
    learned that you _can_ detect a directory, with something like
    'if exist dirname\NUL', since the NUL device always 'exists'. See
    DOS help for exact details. But DOS help was discontinued from
    Windows 95 up.. Sigh.)

  - No ability to operate on, test and do I/O on named variables.

  - No global environment (see above).

* I've heard of an ex-Mickeysoft employee who admitted that Mickeysoft
  deliberately put bugs in DOS upgrades so Lotus software would cease
  to work. Someday, Mickeysoft will get theirs. I want to be there.

* The BIOS. There are some real beauties in here. Some of them are to
  do with oddities of the documented interrupt calls, while another
  whole field of problems has to do with what is not present (or at
  least, not documented). The issue of why some crucial existing calls
  are not documented is also quite interesting.

  First, the documented ones.

  - Int 21/09, Display string. Arg is a pointer to a string. Ah,
    'string' as in a zero terminated string? NO! It must be terminated
    with a '$'. This of course means you can't have the '$' character
    in the text to be displayed. Now I find this particularly
    annoying, since I am one of those people who prefer to see
    hexadecimal numbers formatted like '$F3AC'. This is because I cut
    my programming teeth on Apple IIs long ago, and old habits are
    hard to change. Besides, '$F3' is easier to parse than '0F3h'.
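    For the record, here is what using it looks like - a minimal
    sketch, assuming Borland's bdos() helper from <dos.h> and a small
    data model (so DS already points at the string):

      #include <dos.h>

      int main(void)
      {
          /* Note the terminator - and no way to print a '$'. */
          char msg[] = "Hexadecimal like $F3AC? Not via this call.\r\n$";

          bdos(0x09, (unsigned)msg, 0);   /* AH=09h, DS:DX -> string */
          return 0;
      }

    Run it and you get "Hexadecimal like " - the output stops dead at
    the first '$' in the text. CP/M compatibility, frozen in place
    forever.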
  - There are others, but I couldn't be bothered.

  Then the undocumented ones.

  - There are lots of these, many of which are essential to the
    functioning of quite everyday programs. I won't list them here;
    there are too many, and there are already good books on the
    subject. See "Undocumented DOS", by Andrew Schulman et al. Pub:
    Addison Wesley, ISBN 0-201-63287-X.

  For my comments on _why_ these calls are not officially documented,
  see TECHNOL\APPENDIX\MISTAKES.DOC 'Information hiding'. This is not
  really relevant to the 'IBM conspiracy' theory, unless IBM and
  Mickeysoft are acting in collusion. Now that's so hard to imagine,
  isn't it? However, it is a further example of the ill effects of
  corporate control of socially important software.

* A related subject to undocumented features is the DOS 'setver'
  program. The purpose of this is to make DOS lie about its version
  number, in specific ways to specific software. The user can tell
  setver how to lie to given application programs. Besides being a
  contrary idea, the upshot of it is that DOS has:-

  - A list of programs that the user is likely to run.
  - For each, an indication of how old it is (ie the DOS version it
    wants).

  Just what you would need if you felt like deliberately causing old
  software to stop working, in mysterious ways. It seems to me that
  the only difference between having applications just run regardless
  of the DOS version number, versus having them check it but be lied
  to by DOS (besides the added silliness), is that the second method
  gives Mickeysoft the opportunity to be devious. Can you imagine
  Mickeysoft being devious?

Flaws in Windows (Whim-doze)
----------------------------
... We've Infected The Borg With A Deadly Virus... Windows 3.1!

These can't be blamed on IBM, or even on Mickeysoft really. The root
cause of the poor quality of this software, as in so many other
things, stems from our flaky political and legal system's inability to
deal with issues such as proprietary rights to intellectual works in
any reasonably sane way. For lengthy discussions elsewhere, see
MANIFEST\DATA_OWN.DOC and also POLITICS\POLICIES\PATENT.DOC.
Nonetheless, many things in Windows really piss me off! If DOS is
Dreadful, Windows is Worse!

Generally, I try to avoid using Windows, since even with my fast 386,
it still grinds. (Yea, I know, a 386 is ancient, but then I'm sick of
upgrading every year, and I sense the end of the Intel empire blowing
in the RISCs. Think I'll sit back and skip a processor generation or
two.) So I do not consider myself a Windows expert user. If anyone
knows ways to get around any of these, let me know. With so many
complaints about Windows, I call it a 'point and cluck' interface.

1  Opened windows MUST remain inside the 'Prog Manager' window. They
   cannot even extend outside its boundaries. This is an entirely
   artificial restriction. It may stem from some stupid marketing
   type's pedantry about whether things can be placed on 'the desktop'
   or 'wallpaper'. Or maybe it's just so there is another technical
   difference from the Mac's windows. But practically, WHY NOT allow
   users to put anything wherever they bloody well like? They paid for
   the pixels, after all.

2  Icons in Prog Manager don't STAY where you leave them! I want them
   to NEVER move, unless I say so.

3  The scroll bars at right and bottom. Ergonomically, these are an
   abortion. To go up/down in a text file, you need to keep moving the
   cursor back and forth between the two arrows. Grabbing the position
   marker to move a little is also very imprecise, due to its varying
   'sensitivity' depending on the size of the file, which is very bad.
   With even moderately large files, Microsoft's method simply does
   not work. This one is also related to the stupid decision to have
   only two buttons instead of three on Mickeysoft mice.

   The ideal scroll bar interface works like this (sketched below):
   there are no 'arrow' gadgets on the bar. To go up/down, you just
   click the left/right button on the scroll bar. The AMOUNT you go
   up/down depends on where you click on the bar: top = u/d one line,
   bottom = u/d one page. Thus the movement 'speed' does not depend on
   the file size. To go to a fixed position in the file, click the
   middle button on the bar. Then the bar represents the whole file:
   start at top, end at bottom. No mucking around! The 'current
   position' marker should still show the position and size of the
   current screen in the file, but there is no need to ever 'grab' it.

   Also, in text editors the u/d scroll bar should be at the left,
   closer to where the cursor usually is, for less average movement.
   This leaves the right bar for equally useful functions like 'search
   up/down for selection'. I know this method is vastly superior to
   the Windows style, since I use both Windows, and an editor that
   works as I have described. The editor's method wins hands down.
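   To make the behaviour concrete, a little sketch - all the names and
   units here are hypothetical, it's just the mapping described above,
   in C:

     /* Left/right button: scroll by an amount set by WHERE you click
        on the bar - top of bar = 1 line, bottom = a whole page. */
     long scroll_amount(int click_y, int bar_height, int page_lines)
     {
         return 1 + (long)click_y * (page_lines - 1) / (bar_height - 1);
     }

     /* Middle button: the bar maps the whole file, so a click is an
        absolute position - no grabbing, no drift with file size. */
     long jump_to_line(int click_y, int bar_height, long file_lines)
     {
         return (long)click_y * (file_lines - 1) / (bar_height - 1);
     }

   Two one-line functions. That's the entire 'technology' Mickeysoft
   couldn't manage.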
4  Total 'free for all' dumping of all sorts of things in any
   directory that Windows feels like. Deleting or upgrading Win
   utilities is a major problem: finding all related old files &
   killing them can be very painful. Installing a utility often adds
   files to the main Windows directories! Windows encourages creeping
   disk mould.

5  How do you make Windows exit AT ONCE, ie with no query? In general,
   I find that a good way to quickly get a feel for how 'clean' a new
   piece of software is, is to start it up, then see how easy it is to
   get OUT of it without reading any manuals or help info. Windows is
   too slow on exit. Its designers didn't seem to consider that anyone
   might ever (or often) want to leave it. In file PROGMAN.INI there
   is a '[restrictions] NoClose=1' entry that prevents users from
   exiting Windows at all, but no entry for 'quick' exit! Typical. It
   takes too long to start up, too.

6  Deleting an icon in a window does not delete associated files, even
   if no other references exist to them. So you end up with
   'forgotten' files in \win. Is this just stupid design, or did the
   disk drive manufacturers slip them a few dollars?

7  Icons for folders in the Program Manager window are all the same.
   You can't edit them, or replace them with other icons. Why not?
   What's so special about these icons? So they're all folders, so
   what? This is another 'non-regular' aspect of Windows. In a
   hierarchy of icons, why are ALL levels not treated equally?

8  Wow! Just installed Mickeysoft Visual C++ V1.0 (really C version
   8). A full install would have taken 58 Megabytes! Unbelievable! I
   hear that Mickeysoft do not use their own C to develop itself. Say
   no more.

9  Can't grab & move windows that are not 'top' without making them
   pop to top. Rule: actions should not have side effects! If you want
   to change window order, there should be ONE explicit action to do
   that.

10 There should be a SINGLE, direct action to close a window, ie ONE
   click on the close gadget. The other functions now in this gadget
   should be elsewhere.

11 The window Minimise/Restore/Maximize controls are too complex &
   duplicated. Control gadgets along the top of the window, vs menu
   items, are:-

   [close]*******top*bar****[down]*[up]   =======>   ******[UpDown]
      |                        |     |  (Maximize)         |
      \==>Menu:                |     |                     |
          Restore---same-------|-----|---------------------/
          Move                 |     |
          Size                 |     |
          Minimize--same-------/     |
          Maximize--same-------------/      This is all silly
          Close
          Next

12 If you have ever needed to swap hard drives and partitions around,
   and one of them had all your Windows stuff on it, and after backing
   up all partitions, rearranging and reformatting them, then
   restoring all drive backups to the new partitions, you will have
   discovered a fine Windows urk. Windows does not have a global
   variable for 'the drive that it is on', ie equal to C:\ or D:\ etc.
   Oh no... All through Windows' files there are explicit pathnames
   like E:\WIN\STUPID\---. Great. Delete _everything_ and reinstall
   from scratch. Lose all personal setups. Redo all of those too.
   Wonderful. This is ease of use?
13 In Windows, the "About" texts never say what the application is
   _for_, just what it's called and who wrote it. Sometimes what it's
   for is not at all obvious. Actually, what's Windows for?

14 In the Control Panel - Colour - Custom: selecting a custom colour
   box should reset the 'custom colour cursor' and sample patch to the
   value for that custom colour box. Then one could _adjust_ it,
   instead of having to guess at its original colour before re-mixing.
   Not so important with only 8 bits per pixel graphics, but for 15,
   16 or 24 bit cards, it's a pain.

15 You can't use "printf()" for debugging C programs in Windows.
   Interesting. And Win 95 does away with the DOS prompt/command line?
   Even more interesting. I can see that this trend could easily
   result in a situation in which it would be impossible to write any
   software at all unless you'd already mastered the depths of some
   massive, proprietary (and probably partly suppressed) OS like
   Windows. Also, in which _only_ development tools by the OS creator
   company would work (or be available). And also, in which you
   wouldn't know enough of the low level details to be able to do
   anything 'unusual'! You would also then be firmly locked into the
   upgrade treadmill. This is a sort of software totalitarianism. A
   horrifying fate.

Also related to Windows:-

Mice: Two button mice give me the shits. Why in the name of Gates
don't they all have three buttons? I have a Logitech three button
mouse and an editor that uses all three. Even though the editor is
about eight years old, and has an annoying bug, it still beats most
recent programmer's editors, just because of the three mouse buttons
(used cleverly). This one is Mickeysoft's fault, but what's their
excuse? Hold up your hand, fold down the thumb and little finger - how
many left? Why deprive any of these of its own button? Maybe some big
wheel at Mickeysoft can't move those fingers independently? Do they
have webbed fingers? Tentacles?

I read that Mickeysoft's new version of Windows, called Chicago, is on
beta test, and that its two claims to fame are that: a) it has a 'task
switching bar' along the screen bottom, and b) File Manager got the
toss, replaced with 'Explorer' - a better way of getting at files.
Well, I haven't seen it yet, but it sounds like just more of the same
to me.

Heard the story of the person who renamed \WIN to \CURTAINS, then
deleted it?

Operating system command script languages
-----------------------------------------
Computers are supposed to be able to do routine tasks automatically,
not require constant human supervision and repetitive command entry.
Without a powerful operating system level command script language, a
computer is limited to performing actions for which software utilities
are available. Without it, users cannot easily set up the machine to
perform unique tasks.

Note: I am talking only about computer literate users; others are not
relevant to this discussion. There are still many without the ability
to do anything with computers other than use commercial software, but
this does not mean we need to consider them when talking about what
more knowledgeable people should be able to get their machines to do.
In the future it is certain that a greater proportion of people will
be computer literate and able to specify computer tasks for
themselves.

Now DOS, for instance, has a particularly lame and useless script
language (batch command files).
There are so many limitations to this that it is not even worth
considering as a true script language, but DOS's so called successor
(Windows) has NO script language, just a glorified manual operation
sequence recorder. There are so many flaws and omissions in the DOS
batch language that detailing them would take more time than I care to
spend. Just refer to the sections in 'TECHNOLOGY\General OS' for
descriptions of what a good OS _should_ do.

I am aware that DOS was originally written long ago, and was based on
something even more ancient, and that it had to maintain compatibility
as it was improved. Nonetheless, the persistent failure to add key
features to DOS is outrageous. These lacks are another aspect of PCs
that can only be explained by assuming 'gross stupidity' on the
designers' part, or alternatively by a conspiracy to hobble public
computer use.

Miscellaneous gripes
--------------------
* Segmented processor - complexity, wasted man hours, barrier to
  entry.

* Ever wondered why the "Print Screen" key has "SysRq" on its side? I
  found some old IBM doc that babbles on about some operating system
  no-one ever heard of, that uses this key. Think it was what they
  originally planned to lumber us with, before giving up on doing
  their own OS. Then they made the colossal mistake of getting
  Microsoft to do it (though MS-DOS actually was written by Tim
  Paterson of Seattle Computer Products, under the name QDOS - Quick
  and Dirty Operating System). Ref: the MS-DOS Encyclopedia, by
  Microsoft Press, and Accidental Empires, by Robert Cringely. So why
  do keyboards STILL have the "SysRq" key? Incidentally, the IBM
  babble seemed to be talking about a multi-tasking OS. Later versions
  of the PC BIOS call Int 15/85 when the 'Alt-Print Screen' key (ie
  SysRq) is pressed, but it is a 'do-nothing' function, unless user
  code has hooked that vector.

* Can't blame the stupid QWERTY keyboard layout on IBM, yet
  nonetheless, we are still using a key layout that was designed to
  slow down typists so the letter hammers didn't get tangled up. Refer
  to the history of early typewriters. At the very least, the shift
  key(s) should be moved to below the middle of the space bar, where
  thumbs could easily use them. Maybe we'll never see sensible (eg
  Dvorak) keyboards. SCREW convention! Personally, I would love to buy
  a keyboard with a sensible key layout. I don't mind re-learning to
  type, even just to get a 'shift' key under my thumbs, not my little
  fingers.

Related topics
--------------
Apple computers: Having failed to kill off Apple, which once made
dangerously advanced hardware and software, steps taken by the 'power
base' to lessen the threat were:

* Have a totally establishment type appointed as CEO. John Sculley was
  a senior executive of a soft drink company before he took over
  Apple, for heaven's sake!

* Use management 'plants' like Sculley to ensure that Apple stays in
  the 'high cost, low volume' market. This minimises the number of
  people that get exposed to superior technology in the first place.
  Also, direct that Apple technical development staff spend all their
  time producing numerous slightly different styles and models, rather
  than put lots of effort into coming up with really new stuff (which
  they could have; Apple attracted a lot of very bright people in the
  early days.)

* Form a partnership between Apple and IBM, and promote the mutual
  development of software standards and operating systems. Don't
  expect any great revolutions out of this development.
1999: The book 'Infinite Loop' (the inside history of Apple) is a
great read, but very depressing. Oh, the incompetence of humanity.

The Power PC
------------
April 94. I just got a copy of the Motorola 'PowerPC' 601 'RISC'
processor chip data book. This stuff is the product of some obscure
collaboration between IBM, Apple and Motorola, and it is totally
ghastly. I thought RISC meant Reduced Instruction Set, and that this
was supposed to imply a degree of simplicity? Not according to IBM!
Somehow IBM seems to have convinced normally sane Motorola to
implement a version of the IBM 'POWER' architecture, and the result is
one complicated mother of all chips, and a 760 page manual. This CPU
is undoubtedly fast, but the price we pay is that the bloody thing is
so complicated that the idea of learning to use it in any reasonable
period of time is out of the question. Kafka would be proud.

Big Endian or Little Endian? A hard decision for some, so they
implemented both! In the 601 chip, you can actually change this under
software control. So now when you examine memory, the data is not just
back to front, it's actually in _unknown_ ordering! If this is the
next generation of home computing, then I'd say that IBM has now
refined their 'excessive complexity' barrier to development and
application of computer technology to a very effective block indeed.

Hey! It has an instruction called "EIEIO". Someone has a sense of
humor.

What's also funny is that as soon as the PC field starts to drop the
segmented register architecture (Windows NT uses the 80386 'flat'
memory model), IBM moves to change the 'fashionable' processor to
something with a completely different form of 'excess complexity'
factor built in. Refer to the Required Attributes section at the
beginning of this article.

Operating systems: Make sure new operating systems keep getting more
and more complex, and need more and more disk & RAM space. Have you
seen the Windows Software Developer's package documentation?

Amusing items
-------------
* Don't you love those LEDs on the front of some PCs, that display the
  speed in MHz? Do you know they just display two patterns, one for
  'normal' speed, and the other for 'turbo'? There are three wires to
  these displays: Ground, +5V and 'turbo'. The 7-segment patterns to
  display are selected with jumper blocks behind the LEDs. My PC is
  unencumbered with one of these, but if it was, I'd set it to show
  "HI" or "LO".

Code optimization
-----------------
I read an article in PC Techniques Jan/Mar 94, on the joys of hand
optimizing assembler code for the Intel Pentium. Oh dear.... Any
processor that makes this seriously worthwhile (ie is so complex and
'exception ridden' that compilers cannot easily produce the best, or
even reasonably optimal, code) is _by_definition_ a festering bucket
of pus. Complex instruction sets, with obscure side effects, are a
major evil in the 'waste of human time' class.

Some people seem to get their kicks from performing feats of
machine-head acrobatics, doing optimization of awful processor code
like that for the Intel 80x86/Pentium chips. These people are just
displaying their inability to think at higher levels, where they
should be considering whether what they're doing is really worthwhile
in any wider sense, or whether they are just doing a hamster's
treadmill performance. The only real effect of all their effort to
wring speed out of such ghastly processors is to help perpetuate the
evil of those chips' existence.
You will often hear people alternately proclaiming the benefits of
languages like C++ for increased portability and re-usability of code,
then dropping into discussions of the minutiae of hand optimized
assembler code to speed up said C++ programs. Err.. guys, there seems
to be a bit of a contradiction there. What's the usual lifetime of a
processor generation? About 2 years? For this, you would fill your
head with such details? Life's too short.

It's interesting, though. I seem to have been one of very few people
who disliked PCs, and especially Windows, from the first days of their
introduction. Till recently, almost everyone seems to have worshipped
the bloody things. However, more and more often now I hear comments
about PCs like "Oh God, how do we get out of this mess?" Hopefully,
you are reading this rant as a part of my Everist Manifesto work. One
strand of that is a detailed plan of escape from the madness of
'design by corporate marketing' - and how to produce a sensible
computer architecture.

Summary
-------
The existence and nature of the PC (or rather the non-availability of
anything better) demonstrate two unpleasant components of human
nature.

The first of these is the tendency of powerful people to act
exclusively in their own interests, without any concern at all for the
wider, long term effects on human society. Hence we have the creation
of such a limited, 'progress inhibiting' machine in the first place.

Secondly, we see the pliant way in which most people will aid and
perpetuate any system with some perceived social momentum, without
questioning the nature and intrinsic worth of the structure. Thus we
have the existing massive and expanding industry with the rotten PC at
its heart.

Well, the bigger they are, the harder they fall.