[luau] the evil # 1460

Charles Lockhart lockhart at jeans.ifa.hawaii.edu
Thu Apr 25 18:30:42 PDT 2002


I have a client-server app going: the server side is written in C and runs on an
embedded Linux device with the 2.2 kernel, and the client side is written in
Java.

For the server side, I have (paraphrased, sorry):

            #define CLK_BUF_SIZE    (128 * 1024)
            int bytes_recvd, bytes_total = 0;

            while(bytes_total < data_len)
            {
                /* read up to CLK_BUF_SIZE bytes into the buffer at offset bytes_total */
                bytes_recvd = read(sockfd, &data[bytes_total], CLK_BUF_SIZE);
                if(bytes_recvd < 0)
                {
                    printf("error receiving data\n");
                    break;
                }
                bytes_total += bytes_recvd;
            }

On the client side, I have (again, paraphrased, sorry):

    byte[] data_buf = new byte[128 * 1024];

    // ... some code filling data_buf with data_len bytes ...

    out.write(data_buf, 0, data_len);

Ok, running my client on Linux I have no problems.  Running my client on
Win98/Win2k, if data_len is greater than about 2622 bytes, the out.write()
from the client ends up being seen as two separate reads on the server, the
first always being 1460 bytes and the second being the remainder.  I also get
a bunch of 0's pushed into the data (gee, would you believe 1460 of them?),
and then the data picks up where it left off, essentially cutting off the last
1460 bytes of my data.

Anyway, I'm totally not understanding what's going on with this thing.  If I
send 2500 bytes, everything is fine; if I send 2622 bytes, everything is fine;
if I send 2624 bytes, I start getting weird behavior.
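In case it helps frame the question, here's the kind of length-capped receive
loop I'm guessing I'm supposed to be using instead.  This is just a rough
sketch on my part, not code I've run: recv_all() is a name I made up, and
sockfd/data_len are meant to be the same variables as in the server snippet
above.

            #include <unistd.h>

            /* Sketch: read data_len bytes from a stream socket, never asking
             * for more than the bytes still outstanding, and stop if the
             * peer closes the connection early. */
            static int recv_all(int sockfd, char *data, int data_len)
            {
                int bytes_recvd, bytes_total = 0;

                while(bytes_total < data_len)
                {
                    bytes_recvd = read(sockfd, &data[bytes_total],
                                       data_len - bytes_total);
                    if(bytes_recvd < 0)
                        return -1;              /* read error */
                    if(bytes_recvd == 0)
                        break;                  /* connection closed early */
                    bytes_total += bytes_recvd;
                }
                return bytes_total;
            }

Is something along those lines what I should be doing, or is the problem
somewhere else entirely?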


-Charles



