Because I only use very short DNA strings, the 200k data limit is enough... (this is for the movement... as for the gun, I'll use an improved version of Nihil's gun.)
The movement. The basic principle: check which situation you are in. Create a random movement pattern. Execute it. If it works: cool. If it works several times: very cool. Let it mutate. Now we have several movement patterns for this situation (mother and daughters), to fool some pattern matchers. If a pattern fails, kill it. I've decided to use a maximum number of situations that can be recorded, because otherwise there would be billions of possibilities (...). If the maximum limit is reached, a new pattern (DNA string) can only be created when another one fails. Like on Earth. ^^ This should end up in a stable set of DNA strings that work against a specific robot, after some hundred rounds of fighting. If the DNA memory is full and Gaia gets into an unknown situation, it tries to remember the most similar one. This is done by calculating the "distance" to another situation. Distance means the situation (the sector you are in, the distance to the enemy, etc.) is interpreted as an n-dimensional vector. (Every situation is considered a vector, or a point in n-dimensional space, so you can calculate the nearest known point to the one you are in.)
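To make the "nearest known point" idea concrete, here is a minimal sketch of such a lookup. All names are placeholders of mine, not Gaia's actual code; it assumes each situation is stored as a small array of discretized features.

```java
// Minimal sketch of a nearest-situation lookup: treat each situation as a point
// in n-dimensional space and return the closest stored one. Hypothetical names.
class SituationMatcher {

    // Plain Euclidean distance between two situation vectors.
    static double distance(int[] a, int[] b) {
        double sum = 0;
        for (int i = 0; i < a.length; i++) {
            double d = a[i] - b[i];
            sum += d * d;
        }
        return Math.sqrt(sum);
    }

    // Find the stored situation nearest to the current one.
    static int[] nearest(int[] current, Iterable<int[]> knownSituations) {
        int[] best = null;
        double bestDist = Double.POSITIVE_INFINITY;
        for (int[] candidate : knownSituations) {
            double d = distance(current, candidate);
            if (d < bestDist) {
                bestDist = d;
                best = candidate;
            }
        }
        return best; // null if nothing is stored yet
    }
}
```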
After every round, Gaia checks for similar DNA strings/situations and combines them into one, to save memory. For example, say a pattern works in every sector as long as the distance to the enemy bot is xxx or lower. Then it deletes all those situations from memory except one, and sets the weighting factor of "sector" to 0 or a low value. This makes sure that next time the distance is xxx or lower, the pattern that worked will be considered "the nearest situation's pattern". This is, btw, the way a human brain works, I think.
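A weighted variant of the distance above would express that merging: a weight of 0 on the "sector" dimension makes one stored situation cover every sector. This is only my illustration of the idea, not Gaia's code.

```java
// Weighted distance: a weight of 0 means "this dimension doesn't matter",
// so a single merged situation matches regardless of, e.g., the sector.
static double weightedDistance(int[] a, int[] b, double[] weights) {
    double sum = 0;
    for (int i = 0; i < a.length; i++) {
        double d = (a[i] - b[i]) * weights[i];
        sum += d * d;
    }
    return Math.sqrt(sum);
}
```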
Don't situations need a lot of memory? No, they don't. :P I use a simple integer to store one. 4 bits to specify the sector Gaia is in, another 3 to specify the distance level to the enemy, and so on. Just 4 bytes per situation is more than enough. This way, I can save everything in a HashMap, using the situation's integer value as the key. Cool, isn't it?
Some more details without context: I divided the battlefield into several sectors (corners, middle, near the walls). 0-100 is distance level 0, 100-250 is 1, 250-500 is 2, and so on... I also save the number of bullets that are still in the air, and some other things.
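Here is a sketch of how such a situation could be discretized and packed into a single int. The 4-bit sector and 3-bit distance level are from the text above; the third field and all helper names are my assumptions.

```java
// Pack a situation into one int so it can be used as a HashMap key.
class SituationKey {

    // Bucket the raw distance into the levels described above.
    static int distanceLevel(double distance) {
        if (distance < 100) return 0;
        if (distance < 250) return 1;
        if (distance < 500) return 2;
        return 3; // everything farther away
    }

    // 4 bits for the sector, 3 bits for the distance level, a few more for the rest.
    static int encode(int sector, int distLevel, int bulletsInAir) {
        return (sector & 0xF)
             | (distLevel & 0x7) << 4
             | (bulletsInAir & 0x7) << 7;
    }
}
```

The pattern lookup then becomes something like `patterns.get(SituationKey.encode(...))` on a `HashMap<Integer, Pattern>`.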
Now, isn't this freaky? In theory, Gaia should be able to avoid almost every bullet, if the DNA-string language is able to provide the needed functions.
Now the gun: I'm using a bullet ring, only saving the center and radius. I'm also firing 13 virtual bullets, but I don't simulate them until the ring touches the enemy. Then I check which of them would have hit. If a virtual bullet hits, fine: use this method from now on in this specific situation (the situations are mostly the same ones I use for the movement). If not, check the angle that would have hit, and use this angle from now on (this is mostly like GuessFactor Targeting).
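Here is a hedged sketch of that bullet-ring check, under my own naming: only the firing position and a growing radius are tracked, and the 13 candidate angles are only evaluated once the ring reaches the enemy.

```java
import java.awt.geom.Point2D;
import robocode.util.Utils;

// Sketch of a bullet ring: no per-bullet simulation, just a center and a radius
// that grows by the bullet speed each tick. Names and structure are mine.
class BulletRing {
    final Point2D.Double center;   // where the virtual bullets were fired from
    final double[] angles;         // 13 candidate absolute firing angles (radians)
    final double bulletSpeed;
    double radius = 0;

    BulletRing(Point2D.Double center, double[] angles, double bulletSpeed) {
        this.center = center;
        this.angles = angles;
        this.bulletSpeed = bulletSpeed;
    }

    // Call once per tick. Returns the index of the virtual bullet that would
    // have hit, or -1 if the ring hasn't reached the enemy / nothing hits.
    int advanceAndCheck(Point2D.Double enemy) {
        radius += bulletSpeed;
        double dist = center.distance(enemy);
        if (radius < dist) return -1;                       // ring not there yet
        double bearing = Math.atan2(enemy.x - center.x, enemy.y - center.y);
        double halfWidth = Math.atan(18.0 / dist);          // a bot is about 36x36 pixels
        for (int i = 0; i < angles.length; i++) {
            if (Math.abs(Utils.normalRelativeAngle(angles[i] - bearing)) <= halfWidth) {
                return i;
            }
        }
        return -1;
    }
}
```

If no virtual bullet hits, the `bearing` computed above is exactly the angle that would have hit, which is the value to remember for this situation.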
Comments are welcome.
Maybe the text is difficult to understand, but just read it a few times...
The newest version of Gaia (0.01) is already able to change its colors :) At the moment, I'm developing the DNA language... should be done this week...
I'll keep you up to date on my newest creation here, so check back from time to time.
-> My next step will be: write a lot of neuron classes that provide all the necessary functions, like setAhead, setTurnRight, etc.
I'm going to sleep now. HoD
As for my splurge of relevancy... This idea sounds very good. However, the problem as I see it is that there are many different guns out there; for a movement system to be powerful, it has to be able to defend against all of them. It seems very difficult for an evolving DNA neuron sequence to adapt to all gunning systems without understanding their functionality. SandboxDT's movement system, for example, mathematically folds the walls into its movement and avoids getting cornered, all the while normalizing its movement curve. It is hard to imagine an evolved movement system being able to compete with such a well-handcrafted system.
Not to say that this idea won't be totally cool; I've thought a lot about an idea like this before. The difference is that in my idea the bot learned from other bots' movement systems rather than creating its own. For example, if you fought it a billion rounds against Walls, it would move exactly like Walls; if you fought it a billion rounds against SpinBot, it would move exactly like SpinBot. If you fight both, it would learn which movement works best against which enemy. The problem was that I couldn't really wrap my head around programming a pattern analyser that matched things like randomness. Anything beyond the likes of the sample bots was too complicated for me to even begin imagining how to design such an analyser. Hence I never really got the idea off the ground.
However, a genetically evolved movement system is different, and it's a subject I'm extremely interested in; I'm glad you're trying it out. Good luck! :-) -- Vuen
edit:
I thought about using neural targeting. There are some bots out there that do that already, but I don't understand how they do it. They have a NN, they give it some input... but what the hell happens then? In my case, I have to write neurons that provide some basic functions. How does a NN work without this? What do they do with the input? I can't believe they develop circular aiming or something like that just by getting some input and having no idea what to do... HoD
Feedforward multilayer neural networks (like the kind used in NRLIBJ) are a very specific mathematical concept. Each node sums its inputs, applies a limiting function, and then sends the resulting value over its output connections, each of which multiplies the value by some weight before it arrives at the next node. The way the network actually learns is through an algorithm called back-propagation, which computes the error at the output layer, figures out which nodes were responsible for that error, and adjusts their weights accordingly. Using this learning algorithm, this kind of neural network can approximate any continuous function, linear or non-linear. If you'd like more information, a Google search for "back-propagation" or "feedforward multilayer neural network" should return something useful. -- nano
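For readers who want to see the "sum, limit, weight" step in code, here is a generic single-layer forward pass in Java; this is textbook material, not code taken from NRLIBJ.

```java
// One layer of a feedforward network: weighted sum of the inputs plus a bias,
// then a squashing ("limiting") function on each neuron's output.
class Layer {
    final double[][] weights;   // weights[neuron][input]
    final double[] bias;        // one bias per neuron

    Layer(double[][] weights, double[] bias) {
        this.weights = weights;
        this.bias = bias;
    }

    static double sigmoid(double x) {
        return 1.0 / (1.0 + Math.exp(-x));
    }

    double[] forward(double[] in) {
        double[] out = new double[weights.length];
        for (int n = 0; n < weights.length; n++) {
            double sum = bias[n];
            for (int i = 0; i < in.length; i++) {
                sum += weights[n][i] * in[i];
            }
            out[n] = sigmoid(sum);  // limiting function keeps the output in (0, 1)
        }
        return out;
    }
}
```

Back-propagation then adjusts `weights` and `bias` in the direction that reduces the output error, layer by layer from the output back to the input.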
Not at all, I was attempting to answer your questions. I hope I didn't offend you. -- nano
ScruchiPu is open source. You can take a look at it (also check the ScruchiPu and NeuralTargeting pages). You will see that it has a velocity and a heading-change vector as inputs, and the outputs of the network are the predicted velocity and heading change for the next tick. So it aims by iterating on the NN (using the predicted headings and velocities to guess the enemy movement). -- Albert
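A sketch of that iterative aiming loop, with a placeholder `NeuralNet` type standing in for the trained network (this illustrates the idea, it is not ScruchiPu's actual source; angles are assumed to be in radians):

```java
// Placeholder for whatever trained network is used; only the shape matters here.
interface NeuralNet {
    double[] predict(double[] inputs); // {velocity, headingChange} -> next tick's values
}

class NeuralAimSketch {
    // Roll the enemy's position forward one predicted tick at a time, until the
    // bullet would reach it; then return the bearing to aim the gun at.
    static double predictBearing(NeuralNet net, double ex, double ey, double heading,
                                 double velocity, double headingChange,
                                 double gunX, double gunY, double bulletSpeed) {
        int tick = 0;
        while (++tick * bulletSpeed < Math.hypot(ex - gunX, ey - gunY)) {
            double[] out = net.predict(new double[] { velocity, headingChange });
            velocity = out[0];
            headingChange = out[1];
            heading += headingChange;
            ex += Math.sin(heading) * velocity;   // Robocode headings are clockwise from north
            ey += Math.cos(heading) * velocity;
        }
        return Math.atan2(ex - gunX, ey - gunY);  // absolute bearing for the gun
    }
}
```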
As I said, I thought about using neural aiming. I have no hope that there is a way to make a NN develop GuessFactorTargeting by learning. I'm trying to do it another way: a child that has never learned anything about vectors, accelerated motion, etc. is able to learn to catch a flying ball. Even a dog can. I'm trying to use this fact. My NN doesn't need to know the exact formula (I'll give it some basics like head-on fire, iterative targeting, etc. so it isn't a fool at the beginning, but this won't be the trump card...). It doesn't need to calculate the exact destination of a bot. For simple ones, it has its basic neurons that I'll steal from Nihil. What my exact plan is, I don't know yet. I need a good idea. ^^ HoD
Glad you're still around; the wiki has been a little slow lately and I've been looking for something Robocode-ish to chat about. When will we be able to see a working version of Gaia? -- Vuen
+ NN completely implemented.
+ I decided to give Gaia 8 control neurons for the movement: up, down, left, right, and the four diagonals. It will drive a minimum of 40 pixels, and it will stop just before a bullet aimed with linear/iterative targeting would hit, or when it would hit a wall within the next 20 pixels, or just before the enemy is able to shoot again (this avoids incomplete moves, interrupted by another enemy shot, which are difficult to rate), or randomly along the way. I think (hope) this will end up as a strong movement, against which even GuessFactor Targeting won't help much. Maybe I'll use another 8 neurons for the same directions but different movement styles.
+ Some (4) guns implemented, based on Nihil's gun but improved, partly a bit, partly a lot (the pattern matcher especially kicks ass now).
Still to do: - Convert the situation Gaia is in into an array of floats to feed the NN (see the sketch after this post). This will be the most difficult part: what does the bot need to know to beat the enemy?
- Use a second NN for choosing the right gun (this will be Gaia's greatest strength, I think; it will be able to choose the best gun depending on the situation classified by a NN, not by distance alone).
Performance so far:
+ kills every sample bot without moving (if using the correct gun)
+ kills PatternBot without moving *hrhr*
PS: the code is very well structured this time... looks like I finally learned how to write really big programs without screwing everything up *happy*
For the gun choosing, I'll also add the speeding-up/slowing-down number. HoD
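As for the situation-to-floats conversion mentioned in the to-do list, one simple approach is to scale every feature into roughly [0, 1] before feeding it to the net. The chosen features and scaling constants below are my guesses, not Gaia's.

```java
// Hypothetical sketch: turn a battle situation into normalized NN inputs.
static double[] situationToInputs(double enemyDistance, double enemyVelocity,
                                  double enemyHeadingChangeDeg, int bulletsInAir) {
    return new double[] {
        Math.min(enemyDistance / 1000.0, 1.0),       // distance, capped around field size
        (enemyVelocity + 8.0) / 16.0,                // Robocode velocity is in [-8, 8]
        (enemyHeadingChangeDeg + 10.0) / 20.0,       // turn rate is at most 10 degrees/tick
        Math.min(bulletsInAir / 5.0, 1.0)            // how many of our bullets are flying
    };
}
```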
Edit (just to let you know): I changed the NN usage a bit. I'm using a modified version of Nihil's anti-gravity movement (improved, with more random movements inside and fuzzy little randomizations) as the standard movement, because it works in most situations. The NN will take control of Gaia in the situations where the anti-gravity movement fails. This way, the overall performance and memory usage should be better. HoD
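For context, the basic anti-gravity idea (in its generic textbook form, not the modified version described above) looks roughly like this: every threat pushes the bot away with a force that falls off with distance, and the bot drives along the summed force vector.

```java
import java.awt.geom.Point2D;
import java.util.List;

// Generic anti-gravity movement sketch: sum repulsive forces from all threat
// points (enemy, wall points, ...) and head in the resulting direction.
class AntiGravitySketch {
    static double[] force(Point2D.Double me, List<Point2D.Double> threats) {
        double fx = 0, fy = 0;
        for (Point2D.Double t : threats) {
            double d = Math.max(me.distance(t), 1);
            double strength = 1000.0 / (d * d);                 // falls off with distance squared
            double away = Math.atan2(me.x - t.x, me.y - t.y);   // direction pointing away from the threat
            fx += Math.sin(away) * strength;
            fy += Math.cos(away) * strength;
        }
        return new double[] { fx, fy };   // drive towards atan2(fx, fy)
    }
}
```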