message to comp.ai.neural-nets
Hi folks,
I sent these messages today (I think) to the newsgroup,
but thought maybe someone here might have an idea. I
want to train/test an ANN using the same idea as the
objective function in GP. Here's the text:
From: Mike <mikee@...>
Subject: Re: training over a set of data?
In article <viq4q5gfn6cr64@...>, Mike wrote:
> In the literature I have read, the documents discuss having
> a data file of the inputs and the expected output for
> each line of input and training the ANN over that set.
> Supply the inputs to the net, examine the difference between
> the ANN's output and the desired output, then backpropagate
> the error through the ANN. What I have is a data set of
> inputs that I want to pass through the ANN, monitoring the
> outputs and then somehow averaging (or some other function)
> the outputs to calculate the error for the entire data set,
> then passing that error through backpropagation. What I'm
> thinking of is the same approach used in genetic programming
> (the objective function).
> Has anyone done this?
Had another thought. I don't think I'm explaining this too
well. The idea is not to test each input against each output, but
to test all inputs against a general idea. Take the realm of
measuring temperatures. As inputs, individually feed the
temperature of the previous day, two days ago, and three
days ago (three inputs) to the ANN and look for the output
temperature of today. These four values are always at the
same scale, so no problem there. Instead of looking at the
three previous days' temperatures WRT today's temperature,
run a month of days and take the average of how good the
ANN is, then use some error value (the Euclidean distance?)
as the training value for that month as a dataset. Repeat
until the output is good enough.
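For what it's worth, this sounds like plain batch (whole-dataset) gradient descent: run the whole month through the net, average the squared error into one number, and do a single backprop update from that averaged error. A rough numpy sketch of the temperature setup; the made-up data, network size (one hidden layer of 5 units), and learning rate are all my placeholders, not anything from the original post:

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up "temperature" series: 33 days of a noisy seasonal curve.
temps = 20 + 5 * np.sin(np.arange(33) / 5) + rng.normal(0, 0.5, 33)

# Inputs: temps from 1, 2, and 3 days back; target: today's temp.
X = np.stack([temps[i:i + 3] for i in range(30)])   # shape (30, 3)
y = temps[3:33].reshape(-1, 1)                      # shape (30, 1)

# All four values are on the same scale, but normalizing still helps.
mu, sd = temps.mean(), temps.std()
X, y = (X - mu) / sd, (y - mu) / sd

# One hidden layer with tanh activation, small random weights.
W1 = rng.normal(0, 0.5, (3, 5)); b1 = np.zeros(5)
W2 = rng.normal(0, 0.5, (5, 1)); b2 = np.zeros(1)

lr = 0.1
for epoch in range(2000):
    # Forward pass over the WHOLE month at once.
    h = np.tanh(X @ W1 + b1)
    out = h @ W2 + b2
    err = out - y
    mse = np.mean(err ** 2)      # one error value for the entire dataset

    # Backpropagate the averaged error (one update per month of data).
    g_out = 2 * err / len(X)
    g_W2 = h.T @ g_out;            g_b2 = g_out.sum(0)
    g_h = (g_out @ W2.T) * (1 - h ** 2)
    g_W1 = X.T @ g_h;              g_b1 = g_h.sum(0)

    W2 -= lr * g_W2; b2 -= lr * g_b2
    W1 -= lr * g_W1; b1 -= lr * g_b1

print("final MSE over the month:", mse)
```

Here the mean squared error over the month plays the same role as the GP objective function: one fitness number for the whole dataset, driving one weight update, repeated until it's good enough.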
Is that more clear?