Re: Microcomputing vs. Microcomputer
- "B. Degnan" <billdeg@...> wrote:
> I know what you're saying, but a processor does not a microcomputing
> system make. What is my definition of microcomputing?
> - single user system that can be operated by just one person
> - system can be changed/adapted
> - programming / configuration changes are implemented
> in real time (instantly)
> "One person / One computer"
Bill, you can make your own definitions for terms as suits your
interests, and more power to you. But how a technical term was used at
a point in time can be demonstrated by reading the literature of the
period and looking for a consensus of use AT THAT TIME.
Now Bill, I originally wrote a lecture down here about history,
language, and microcomputer use. And I had some selected comments
about when Ted Nelson said this or that, and so on. But, hey, I have a
big DELETE key and I used it. Here's the short story.
The term "microcomputer" is specific to one or two classes of digital
hardware during various periods in time - check the literature of the
70's for details, as per a recent post of mine in MARCH.
"Microcomputing", to my general knowledge of the period (which is
considerable), generally referred to doing stuff on or with
microcomputers beyond simply running a canned program. I'm not aware
of a consensus definition for this term at that time, or at any other.
Your definition covers some uses of some microcomputers, but misses
other uses. It is also so diffuse that it covers many activities which
are not related to "microcomputers", including activities from times
before the term "microcomputer" was in use in the technical or trade
literature. Therefore, in my opinion, you should associate another
word with your definition, to avoid historical confusion.
Bill, what you do with your definition of THAT KIND of computing is up
to you, and it's a fine, wonderful definition. But please, use another
word with less historic context. Your statement that "historians will
move away from hardware" does not negate the OBVIOUS and prior
association of the term "microcomputing" with "microcomputers".
If you can't find an appropriate word in the "personal computing"
literature, THEN you might coin your own word, but be prepared to show
why YOUR word is better. And you can't say "I don't care about the
historic use of the word" - not if you are also teaching THAT SAME
history.
If you want an argument about this, I have saved a copy of my lecture.
But don't get me wrong - you are on to something, in my opinion. In
support, as you know I am making the case for the importance of CP/M
as a programming environment which EARLY ON implemented CLOSE TO what
you are talking about in your definition. CP/M was a critical
development in early "personal computers" which led to the creation of
a large base of software, many users, and many producers of computers.
And all of them were COMPATIBLE at an important level, because of
CP/M. It is no accident that, IN PART because the IBM PC's OS had
features and tools in common with CP/M-80, that computer became a
success.
Therefore, Bill, your definition above is of value. Just use another
WORD for it!
Herbert R. Johnson, New Jersey USA
http://www.retrotechnology.com/herbs_stuff/ web site
http://www.retrotechnology.net/herbs_stuff/ domain mirror
my email address: hjohnson AAT retrotechnology DOTT com
if no reply, try in a few days: herbjohnson ATT comcast DOTT net
"Herb's Stuff": old Mac, SGI, 8-inch floppy drives
S-100 IMSAI Altair computers, docs, by "Dr. S-100"
- Sellam Ismail wrote:
> On Mon, 8 Oct 2007, B Degnan wrote:
>> I was looking for a Greek word too. micro = small in Greek (obviously),
>> but that's not really what I am going for. I am very thankful for the
>> considerate responses and they have me thinking. A lot of the "hacker"
>> mentality about computing should be represented somehow. I will work on