- Does learning a Father's trade count as higher learning? For instance,

learning all one needs to know about metal and the effects of fire and heat

on it to become a good blacksmith. Even the nobility were learning to take

over the family business (even if that business was being King).

On Thu, Aug 27, 2009 at 10:37 AM, annsaw3712 <annsaw3712@...> wrote:

>

>

> I've always wondered about schooling in those far off times. . . .Was

> learning only for the Noble class? Were there places of Higher learning?

> What was the deal?

>

> Also did the common people even care about knowledge of anything?

>

>

>

--

Love and Blessings,

Ron Osceola, CHT

http://groups.yahoo.com/group/bearintuitions

804.385.0485

- On Thu, 2009-08-27 at 19:50 -0400, bronwynmgn@... wrote:

> Village officials who couldn't write or read could none the less do

> enough math, and use marks on a tally stick to keep track of the grain

> and animals produced on the manor and render an exact account to the

> lord several times a year - and be able to figure money well enough to

> determine whether he owed the lord money or the other way around, and

> exactly how much money.

The knowledge of mathematics also depended on cultural context. For

example, the Arab cultures invented much of our numbering system and

higher mathematics, including a new branch called "al-jabr," named

after the treatise "Al-Kitāb al-mukhtaṣar fī hīsāb al-ğabr

wa’l-muqābala" (Arabic for "The Compendious Book on Calculation by

Completion and Balancing"). This was written by Muhammad ibn Mūsā

al-Khwārizmī, a Persian mathematician, circa 820 CE.

The "al-jabr" is not, as I once thought, part of the mathematician's

name; rather, it is the name of the operation of moving a term of an

equation across the equals sign while negating it, for instance:

x + 20 = 3x (original equation)

x = 3x - 20 (al-gabr step)

-2x = -20 (al-muqabala step)

x = 10 (dividing both sides by -2 to get the answer)
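The worked example above can be mechanized. Here is a minimal Python sketch of the two steps for a general linear equation a*x + b = c*x + d; the function name and the coefficient form are my own illustration, not anything from al-Khwārizmī's treatise:

```python
def solve_linear(a, b, c, d):
    """Solve a*x + b = c*x + d for x by the two named steps."""
    # al-jabr step: move the constant b across the equals sign, negating it:
    #   a*x = c*x + (d - b)
    rhs = d - b
    # al-muqabala step: balance the x terms onto one side:
    #   (a - c)*x = d - b
    coeff = a - c
    if coeff == 0:
        raise ValueError("no unique solution")
    return rhs / coeff

# The example from the text: x + 20 = 3x
print(solve_linear(1, 20, 3, 0))  # 10.0
```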

The notion of symbolic logic is an elegant leap of intellect. Early

mathematical systems could perform concrete calculations but could not

express abstract relationships between quantities. According to

Wikipedia, the roots of algebra go back to the Babylonians. However,

not every culture in the intervening centuries carried these

abstractions forward. Even today, there are people who can do

computations very well

but whose brains just don't grasp abstract symbolic logic -- in the same

way that *my* brain doesn't grasp music or artistic creativity. :-)

We're all wired differently.

(I have an interest in this topic because my persona traded his

inheritance rights to his father's estate for tuition to study al-jabr

and astronomy in the Arab lands during one of the brief intervals when

we Byzantines weren't at war with them.)

Another important mathematical innovation that was not culturally

universal was the notion of place value. The Roman numeral system, for

example, has only a primitive left-or-right concept of place value. VI

means six, and IV means four, but the system has no true base the way

our decimal system does. Computations of large quantities are

extremely cumbersome without a place-value (radix) system. Again, the

Babylonians had this, but later cultures like the Romans often did not.
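To make the contrast concrete, here is a short Python sketch (my own illustration, using the standard subtractive Roman notation) showing how the symbol count grows where a positional system simply adds digits:

```python
# Conversion table in descending order, including subtractive pairs
# such as IV (four) and CM (nine hundred).
ROMAN = [(1000, "M"), (900, "CM"), (500, "D"), (400, "CD"),
         (100, "C"), (90, "XC"), (50, "L"), (40, "XL"),
         (10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I")]

def to_roman(n):
    """Convert a positive integer to Roman numerals."""
    out = []
    for value, symbol in ROMAN:
        count, n = divmod(n, value)
        out.append(symbol * count)
    return "".join(out)

for n in (6, 4, 1888):
    print(n, "->", to_roman(n))
# 6 -> VI, 4 -> IV, and 1888 -> MDCCCLXXXVIII:
# four decimal digits become thirteen Roman symbols.
```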

The other often-overlooked mathematical breakthrough was the concept of

zero, as a number like any other rather than as the absence of a number.

The notion that you could use a symbol for "nothing" as part of a

calculation dates back to 9th century India, though earlier cultures

(including, once again, the Babylonians) had concepts that *almost* got

there. The key concept is that zero is a number like any other, one that

can be included in calculations to generalize mathematical rules.

As an interesting side note, although modern computers treat zero as

just another number, we have had to go back to the medieval concept of a

different symbol to represent a placeholder for "something that should

have been a number but isn't". For example, when a computer program

tries to calculate 0/0, there is no meaningful numeric answer. Even

though an error message might be issued to a log or displayed to the

user, you still have to put *something* into the memory slot for the

answer. The

Institute of Electrical and Electronics Engineers (IEEE) defined a

standard that includes special "NaN" (Not a Number) symbols that can be

used for situations like this. Essentially, they mark a memory location

as containing "invalid data" so that later calculations won't rely on

these data items as being real numbers.
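For anyone curious, this behaviour is easy to observe from Python, whose floats follow the IEEE standard. A small sketch (note that Python raises an exception for division by zero, so the NaN here is produced from the indeterminate form inf - inf instead):

```python
import math

nan = float("nan")           # the IEEE "Not a Number" value
print(math.isnan(nan))       # True
print(nan == nan)            # False: NaN compares unequal even to itself
print(math.isnan(nan + 1))   # True: NaN propagates through arithmetic

# Python raises ZeroDivisionError for 0.0/0.0, but the equivalent
# indeterminate form inf - inf yields NaN, as the standard specifies:
print(math.isnan(math.inf - math.inf))  # True
```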

The difference between modern and early medieval thought is that we

treat zero as a number but retain the concept of a placeholder for

things that truly *aren't* numbers, such as the result of division by

zero. In early medieval times, zero was thought of as being somehow not

a real number, because you couldn't have "something" that represented

"nothing". Again, it is a leap in intellect to understand the difference

between the symbol for a number and the abstract concept of "number".

Very interesting thread -- thanks to our resident historical scholars

for some very enlightening posts!

Justin

--

()xxxx[]::::::::::::::::::> <::::::::::::::::::[]xxxx()

Maistor Justinos Tekton called Justin (Scott Courtney)

Gules, on a bezant a fleam sable and on a chief dovetailed Or two keys

fesswise reversed sable.

justin@... http://4th.com/sca/justin/