Compare commits

5 Commits

4f72209510  Lewis Van Winkle  2020-12-17 08:27:55 -06:00
    increased training loops to 500

07b29f936c  Lewis Van Winkle  2020-10-10 10:09:22 -05:00
    Merge pull request #44 from timgates42/bugfix_typo_optimization
    docs: fix simple typo, optimizion -> optimization

8ac633de97  Tim Gates         2020-10-10 17:24:47 +11:00
    docs: fix simple typo, optimizion -> optimization
    There is a small typo in README.md.
    Should read `optimization` rather than `optimizion`.

5e147c7e3f  Lewis Van Winkle  2020-10-09 09:53:38 -05:00
    Merge pull request #43 from codeplea/doc_c99
    update doc to specify C99

f6c22401d2  Lewis Van Winkle  2020-10-09 09:52:57 -05:00
    update doc to specify C99
2 changed files with 3 additions and 3 deletions

README.md

@@ -11,7 +11,7 @@ functions and little extra.
 ## Features
-- **ANSI C with no dependencies**.
+- **C99 with no dependencies**.
 - Contained in a single source code and header file.
 - Simple.
 - Fast and thread-safe.
@@ -105,7 +105,7 @@ backpropogation.
 A primary design goal of Genann was to store all the network weights in one
 contigious block of memory. This makes it easy and efficient to train the
-network weights using direct-search numeric optimizion algorthims,
+network weights using direct-search numeric optimization algorthims,
 such as [Hill Climbing](https://en.wikipedia.org/wiki/Hill_climbing),
 [the Genetic Algorithm](https://en.wikipedia.org/wiki/Genetic_algorithm), [Simulated
 Annealing](https://en.wikipedia.org/wiki/Simulated_annealing), etc.

View File

@@ -23,7 +23,7 @@ int main(int argc, char *argv[])
     genann *ann = genann_init(2, 1, 2, 1);
     /* Train on the four labeled data points many times. */
-    for (i = 0; i < 300; ++i) {
+    for (i = 0; i < 500; ++i) {
         genann_train(ann, input[0], output + 0, 3);
         genann_train(ann, input[1], output + 1, 3);
         genann_train(ann, input[2], output + 2, 3);