Java sockets question

I'm writing code that implements the selective repeat protocol. To make a long story short: we used a DatagramSocket class that has been modified a little to corrupt or lose some packets. My program works fine when a packet is lost or when the packets arrive successfully, but corrupt packets put it through hell. I have no idea why. All my code does is ignore the corrupt packet, yet when it goes back up to receive another packet it gets the same packet and immediately decides that one is corrupt too. The checksums always come out the same, even when the DatagramSocket class says it sent the packet without corrupting it. Here is the code I'm using; the format for the data is:

short seqnum
variable data
short seqnum

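To be concrete, this is roughly how I unpack a received packet. It's a simplified sketch of what I mean, not my exact code; the class, method, and variable names are placeholders, and the only thing taken from the format above is the field order (a leading short seqnum, the variable data, and a trailing short seqnum).

import java.net.DatagramPacket;
import java.nio.ByteBuffer;

public class PacketFormat {
    // Unpacks the layout above: short seqnum | variable data | short seqnum.
    static void handle(DatagramPacket packet) {
        if (packet.getLength() < 4) {
            return; // too short to even hold the two shorts
        }

        // Wrap only the bytes actually received, not the whole backing array.
        ByteBuffer buf = ByteBuffer.wrap(
                packet.getData(), packet.getOffset(), packet.getLength());

        short seqnum = buf.getShort();               // leading sequence number
        byte[] data = new byte[buf.remaining() - 2]; // variable-length payload
        buf.get(data);
        short trailing = buf.getShort();             // trailing sequence number

        // The two copies disagreeing is one cheap corruption check.
        if (seqnum != trailing) {
            return; // ignore the corrupt packet
        }
        // ... hand (seqnum, data) to the selective-repeat logic ...
    }
}
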
Once I get this bug out of my code it should work fine; I just cannot for the life of me understand why it goes haywire when it gets a corrupt packet. The only thing I can think of is that the corruption somehow alters the DatagramPacket to a different length, which would throw off my checksum calculations. But I have it make a new instance of the DatagramPacket every time it loops. Any Java experts want to tell me where my code could be going wrong? Thanks
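
And this is the shape of my receive loop, simplified: the plain java.net.DatagramSocket stands in for the modified class from the assignment, and checksumOk is just a placeholder for my real checksum routine. The two things that matter for the bug I described are in the comments: a fresh buffer/packet each iteration, and checksumming over packet.getLength() rather than the full buffer length.

import java.io.IOException;
import java.net.DatagramPacket;
import java.net.DatagramSocket;

public class ReceiveLoop {
    static void run(DatagramSocket socket) throws IOException {
        while (true) {
            // Fresh buffer and fresh packet every iteration, so a short
            // corrupt packet can't leave stale bytes or a stale length behind.
            byte[] buf = new byte[1024];
            DatagramPacket packet = new DatagramPacket(buf, buf.length);
            socket.receive(packet);

            // Checksum over the bytes actually received:
            // packet.getLength(), never buf.length.
            if (!checksumOk(packet.getData(), packet.getOffset(),
                            packet.getLength())) {
                continue; // ignore the corrupt packet and go receive again
            }
            // ... deliver the packet to the selective-repeat receiver ...
        }
    }

    // Placeholder for my real checksum routine.
    static boolean checksumOk(byte[] data, int off, int len) {
        return true;
    }
}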

-Push