LETTERS

Pessimistic Portrayal in Programming Paradigms

Dear DDJ,

I was intrigued by your [Mike Swaine's] worries about the U.S. software industry as expressed in "Programming Paradigms" (August 1988). I believe you are unduly pessimistic.

About 10 years ago, I learned that the British arm of Burroughs, as it was then, delegated a substantial part of its programming to India. There is the benefit that English is a common language, not only between Indians and ourselves but also within India, where there are many other languages besides the majority Hindi. There, too, is a tradition of scientific and technical studies.

The point is this: I do not think the U.S. computer industry's use of Indian labor was a very great secret -- at the time I was not working for an especially favored customer -- and it certainly would not be now. Yet there is no evidence of other firms following the Burroughs example; I don't even know whether Unisys continues the practice. There are great shortages of expertise across a wide range of ADP applications. Salaries are sky-high, and even middle-aged programmers like me can still make a living. Closer to home is the Republic of Ireland, dedicated to giving technical skills to one of the youngest populations in Europe. Some recruitment into the U.K. has occurred, but the great offshore software industry just has not appeared.

I think that employers will always want people carrying out critical work to be on-site. It is significant that Burroughs' Indian programmers were doing largely program conversion -- work that had to be carried out to a high standard of accuracy and to a tight deadline, but routine and not requiring a high degree of interaction.

Just as people in business do not like paying for anything that they cannot drop on their foot, they do not like employing people who are not there to kiss the same. Cheap joke, but you know what I mean. They feel uncomfortable without physical evidence of payroll.

An interesting manifestation of a similar phenomenon occurred here recently. You must know that we had a postal strike. I, and probably many other technically aware people, expected a large increase in traffic on Telecom Gold and smaller communications networks. Not at all. The real boom was in facsimile machines. Even now, none of the large computer recruitment agencies I use has a communications link, in spite of the fact that the capital and running costs are lower than fax. They all have fax. The physical appearance of a document, even a poor copy, is more important than the content.

Many of us are trying to persuade employers in areas where recruitment is difficult to employ us as "telecommuters." We work from home, don't have to travel long distances (in U.K. terms), and need not be distracted by office politics. In return, the employers pay substantially less than their local going rate and do not have to provide accommodation, furniture, or equipment apart from a terminal and a communications link. However, the concept has not spread beyond a core of specialized areas.

You may ask why I am sending you a letter rather than using electronic mail. I would reply that I have had to move my computers to another part of the house, that it is too much trouble to fiddle around with cables and modems at the moment, and that it is more trouble to look up your BBS identity than your business address...but I don't convince even myself. Atavism rules.

Frank Little

Clydach, Swansea

Blind to His Love's Faults?

Dear DDJ,

After some thought, I finally decided to write about a key issue left unspoken in most discussions of "the new Basics." Bruce Tonkin's review of the Microsoft and Borland offerings of Quick and Turbo Basic ("Inserting Elements into a Basic Integer Array," November 1988) is a good case in point. After reading pages of benchmarks and comments about the quality of editors and so on, I still wondered, "How large is the data segment?" In a program written last year in QuickBasic 3.0, I ran up against the 64K data limit -- a limit of the medium memory model. I wondered whether this had been changed in QB4, in the new Basic compiler, or in Turbo Basic. A rereading of the article provided no further clues.

A call to Microsoft straightened me out. QB4 and BASCOM, Version 6, continue to use the medium memory model under DOS, even in the OS/2 world!

Of course, there are workarounds for the new Basics. Microsoft offers large numeric arrays, which are a help if your data is numeric and if you don't mind the execution speed penalty over using the heap in other languages. (Kent Porter suggests this works out to about a 500 percent penalty.) Although the addressing limitations of the 8088/80x86 require workarounds in Pascal and C as well, they exact a more serious penalty on Basic. For many applications, this limitation essentially prevents a new Basic from being considered as the platform. It is important for this information to be given, and put in context, in a review aimed at software developers.
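[For readers who came to the PC after the segmented-memory era: the 64K figure discussed above falls directly out of 8086 real-mode addressing, in which a physical address is formed from a 16-bit segment and a 16-bit offset. Any memory model that keeps all data reachable through a single segment register is therefore capped at 2^16 bytes. The sketch below -- Python used purely for the arithmetic; the constants are the 8086's, not anything from Basic -- illustrates the limit. --Ed.]

```python
# 8086 real-mode addressing: physical = segment * 16 + offset.
# Offsets are 16 bits wide, so one segment spans at most 2**16 bytes --
# the "64K data segment" of the medium memory model discussed above.

def physical_address(segment: int, offset: int) -> int:
    """Combine a 16-bit segment and 16-bit offset into a 20-bit address."""
    assert 0 <= segment < 2**16 and 0 <= offset < 2**16
    return (segment * 16 + offset) % 2**20  # 20-bit address bus wraps at 1 MB

SEGMENT_SPAN = 2**16  # 65,536 bytes: the most one segment can address

# Example: segment 0x1000, offset 0x0010 -> physical 0x10010.
print(hex(physical_address(0x1000, 0x0010)))

# A 10,000-element double-precision array needs 80,000 bytes -- more than
# one segment can hold, which is why anything larger must live in "far"
# memory and be reached through extra segment arithmetic.
print(10_000 * 8 > SEGMENT_SPAN)
```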

I always read Bruce Tonkin's articles and enjoy his often eloquent defense of Basic's honor in an apparently hostile professional world. I am beginning to wonder, though, if he is blind to his love's faults.

Jay van Santen

Topeka, Kansas

Bruce replies:

I think I'd better confess that I'm confused by Mr. van Santen's letter. I did know that Microsoft used the medium memory model for both Basic 6.0 and QuickBasic 4.0. Microsoft could have used the huge memory model, but that would have made for slower data access and a larger amount of generated code in most programs.

If Jay meant "Microsoft should have allowed a choice of memory model," I agree, but it would have meant a more complex compiler and perhaps a less-easy and less-general interface to other languages.

Large arrays are not limited to numeric data; both QB4 and Basic 6 permit arrays of static strings and user-defined types of any size, limited by memory. If he'd like variable-length string arrays larger than a single segment, I agree. I've been asking for that for years.

I don't think he meant code size is limited. The medium memory model allows code up to a megabyte.

There is a bug with arrays of static strings and user-defined types in QB4 and Basic 6.0. If the element lengths are not a power of two, the maximum memory available to the array is about 128K.

Though data on the heap can be accessed faster, I've found Basic's string functions faster, more flexible, and easier to use than those of Pascal or C. Often, that more than makes up for any theoretical losses. For numeric arrays, please note: The alternate math package in Basic 6.0 is the same one used by the Microsoft C compiler.

Don't overlook the fact that large arrays are easy in QB or Basic 6.0, and not always so in C or Pascal. If you have to do a little kludging, it seems to me that you'd lose at least some performance and (more important) code clarity.

I'd never concede that QB or Turbo Basic is a poor development platform compared to Pascal. There are a few jobs where C is better than Basic, and jobs where assembler is better than either. Where Pascal might be better, I've always found that C is better still. If time permits, assembler is always the best choice in those same cases.

If I wrote different software, maybe I'd use C. I've had to write few device drivers and little systems software. Instead, I've been writing database management tools, a word processor, and business programs. For those, Basic is and has been a perfect choice. The medium memory model has been no constraint to me or anyone I know.

I find it impossible to believe any of the applications mentioned could have been more quickly written or debugged in C or Pascal. The small parts where speed proved most critical were identified and then coded in assembler, anyway, so I doubt the C version would show any better performance.

Mohr's Flames

Dear DDJ,

I have been a subscriber for years, and I like the magazine. I am a little put off, though, by the endless PC stuff. I was forced to use one at work and found it and its software abysmal. Aztec C had a number of errors in it, and even though I was supposed to get updates of bug fixes, none ever came.

Flame One: Why is the programmer supposed to pick the large or small code/data model? Is this not a job suitable for the compiler and computer to figure out? The entire Intel line beyond the 8080 seems to compound the problems instead of solving them. Maybe I am just stupid, but Motorola's 68000 series is much better designed.

Flame Two: I responded to an advertisement for the C club in Kansas, which had lots of public domain (so they say) software. Well, that is a fallacy; it is public domain as long as you own the BDS C compiler and library. Somehow I do not think that is what public domain means. Am I naive or stupid in this matter?

Flame Three: On a DEC 11/20, a 64K machine, I have used a simple but powerful editor called TECO, Version 28. It would (and could) do more than any other editor I've seen for small (64K) or even ten-times-bigger (640K) IBM PCs. Emacs, however, is an exception. Why is there no TECO for CP/M machines? We seem to have turned to a swamp of fancy junk editors rather than maintaining a simple, powerful editor like TECO.

End of Flames: I close with the wish that IBM and its clones drown in the swamp of their own making, quietly and without fuss. Intel has pioneered the lowest common denominator of computing hardware, and IBM, by farming it out, has indeed pioneered the same in software for its machines -- a software environment where wild cards sometimes work and sometimes don't. Nothing is consistent there. I have spent a lot of time using Unix and C, and the comparison with the Intel/IBM systems shows the latter to be sorely lacking in both hardware and software. I would hope that Dr. Dobb's would lead the way instead of just playing in the swamp. I eagerly await my NeXT computer, as it seems like the first real step up from CP/M.

Douglas Mohr

Boulder, Colo.


Copyright © 1989, Dr. Dobb's Journal