OK, so I've been trying to figure out the fault in this code for probably a week now.
Hopefully somebody will see what I couldn't.
Anyway, any help would be greatly appreciated. Let's begin.
void serverStatus( const char* data )
{
    char* s;
    int i, l = 0;
    char* tok;
    int tokcount;
    hash_t srv[32];

    bzero( srv, sizeof(srv) );
    s = strdup( data + 20 );
    strcpy( s, data + 20 );

    // Tokenize
    for( i = 0, tok = strtok( s, "\\" ); tok; tok = strtok( NULL, "\\" ) )
    {
        l = ( l ? 0 : 1 );
        if( l ) // if it's a variable
            strcpy( srv[i].key, tok );
        else    // if it's a value
        {
            strcpy( srv[i].val, tok );
            ++i;
        }
    }

    // Dump server info
    tokcount = i;
    i = 0;
    while( i < tokcount )
    {
        if( strncmp( srv[i].key, SRV_HOSTNAME, strlen(SRV_HOSTNAME) ) == 0 )
        {
            char* tmp = sanitizeString( srv[i].val );
            printf( "%s %s\n", srv[i].key, srv[i].val );
            free( tmp );
        }
        else if( strncmp( srv[i].key, SRV_MAPNAME, strlen(SRV_MAPNAME) ) == 0 )
        {
            printf( "%s %s\n", srv[i].key, srv[i].val );
        }
        ++i;
    }

    free( s ); // this is the free that always crashes
}
This is the code in question, although I understand the problem could be earlier, in my network handling (we can explore that later).
What I know is that, when it crashes at all, it generally crashes in the while loop.
Freeing my pointer 's' always crashes, ALWAYS; this bothers me a bit.
Finally, when I do print values, it sometimes prints stuff that isn't supposed to be there (i.e. a memory issue).
The hash_t type looks as follows:
typedef struct info_s {
    char key[256];
    char val[256];
} hash_t;
None of the values that we tokenize are larger than 255 characters.
Lastly, the full code is accessible if you would like to see it crash in real time.
( qquery.googlecode.com )
Thanks much for any insight you can provide.
~vroemer