Not sure if this is the right word. I remember it from my Chemistry classes aeons ago.
Anyway, it is the amount of unpredictability in an opponent's movement behaviour. I am trying to think of a way to determine this Entropy in a segment, over multiple segments, or even over the complete array of segments. The latter may give an indication of how good a bot's movement is.
--Vic
In decision tree learning, there is actually a mathematical definition for entropy that can be used to rate potential "next segmentations" based on how much more information you can gain (i.e., lowering the entropy for the most samples is high information gain). -- Kawigi
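(For concreteness: the entropy in question is the usual Shannon entropy of the visit counts, the sum over bins of -p * log2(p), where p is the fraction of visits landing in that bin. Below is a minimal Java sketch of that calculation for a single stat buffer; the class name and the example buffers are made up purely for illustration, not taken from any particular bot.

    // Sketch: Shannon entropy of a visit-count buffer (e.g. GuessFactor bins).
    public class EntropyUtil {
        // Returns entropy in bits: -sum(p * log2(p)) over the non-empty bins.
        public static double entropy(double[] binCounts) {
            double total = 0;
            for (double c : binCounts) {
                total += c;
            }
            if (total == 0) {
                return 0; // no data yet, nothing to measure
            }
            double entropy = 0;
            for (double c : binCounts) {
                if (c > 0) {
                    double p = c / total;
                    entropy -= p * (Math.log(p) / Math.log(2));
                }
            }
            return entropy;
        }

        public static void main(String[] args) {
            // A sharply peaked buffer (predictable movement) gives a low number...
            System.out.println(entropy(new double[] {0, 1, 20, 2, 0}));
            // ...while a flat buffer (unpredictable movement) gives a high one.
            System.out.println(entropy(new double[] {5, 5, 5, 5, 5}));
        }
    }

So a low entropy over a segment means the opponent is easy to pin down there; a high entropy means the data in that segment tells you little.)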
Hi Kawigi, nice to hear from you again! Can you point me in the right direction on this mathematical definition? You wouldn't by any chance have this formula adapted to rate a GF segment, would you? :-). Mind you, I'm not a math wizard... --Vic
- I googled a bit and found this paper that even I, being mathematically challenged, can understand: https://www.cs.cornell.edu/home/cardie/tutorial/dtrees.pdf. It sure looks interesting and covers a lot of what I have been thinking about for my gun. But I think it will take some time before I can wrap my brain around the theory and actually use it in practice. --Vic
- Hmmm... Actually, when I originally learned about Decision Trees for a class at school, I spent about half the time musing about their applicability to GuessFactorTargeting with Jamougha somewhere on the wiki... let's see if I can find it... Ah, right. Look at Segmentation/Prioritizing - you can get the benefit of my general explanation, Jam and me arguing and musing, as well as code to find entropy and information gain numbers. -- Kawigi
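(The actual code is on Segmentation/Prioritizing; the sketch below is just a rough illustration of the idea rather than a copy of it. A candidate segmentation is rated by its information gain: the entropy of all samples lumped together, minus the visit-weighted average entropy of the per-segment buffers. It assumes the EntropyUtil sketch above, and that each row of the segments array is one segment's bin counts.

    // Sketch: rate a candidate segmentation by information gain.
    // gain = entropy(all samples combined) - weighted average entropy of each segment.
    public class InfoGain {
        public static double informationGain(double[][] segments) {
            int bins = segments[0].length;
            double[] parent = new double[bins];
            double total = 0;
            // Lump all segments together to get the unsegmented buffer.
            for (double[] segment : segments) {
                for (int i = 0; i < bins; i++) {
                    parent[i] += segment[i];
                    total += segment[i];
                }
            }
            if (total == 0) {
                return 0;
            }
            // Average the segment entropies, weighted by how many visits each one got.
            double weightedChildEntropy = 0;
            for (double[] segment : segments) {
                double segmentTotal = 0;
                for (double c : segment) {
                    segmentTotal += c;
                }
                weightedChildEntropy += (segmentTotal / total) * EntropyUtil.entropy(segment);
            }
            return EntropyUtil.entropy(parent) - weightedChildEntropy;
        }
    }

The candidate segmentation with the highest gain is the one that tells you the most about where the opponent will be.)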
- Awesome!!!! That helps a lot, thanx. --Vic