Cygwin script log redirection not working correctly

I have a simple script that sends curl requests and appends their output to a log file.

for i in {1..20}
do
    curl google.com -is >> log.log &
    echo "request # $i" >> log.log
done

After the script finishes, running the following command should report 20 lines, because the loop echoes a "request" line on every iteration.

cat log.log | grep request | wc -l

But it reports only 1 or 2, and the log itself is garbled. I know the curl commands run in the background, so their output will be mixed together, but I should still see "request" 20 times when I grep the log.

The same script works fine on my Ubuntu machine, and it also works fine in my friend's Cygwin. I uninstalled Cygwin and reinstalled it, but nothing changed. Could it be something in my Windows configuration? Does anyone know why this is happening?

My messed-up log:

request # 1
HTTP/1.1 301HTTP/1.1 301HTTP/1.1 301HTTP/1.1 301 Moved Permanently
Location: http://www.google.com/
Content-Type: text/html; charset=UTF-8
Date: Fri, 02 Nov 2012 14:22:32 GMT
Expires: Sun, 02 Dec 2012 14:22:32 GMT
Cache-Control: public, max-age=2592000
Server: gws
Content-Length: 219
X-XSS-Protection: 1; mode=block
X-Frame-Options: SAMEORIGIN

<HTML><HEAD><meta http-equiv="content-type" content="text/html;charset=utf-8">
<TITLE>301 Moved</TITLE></HEAD><BODY>
<H1>301 Moved</H1>
The document has moved
<A HREF="http://www.googHTTP/1.1 301 Moved PermaHTTP/1.1 301HTTP/1.1 301HTTP/1.1 301 Moved Permanently
Location: http://www.google.com/
Content-Type: text/html; charset=UTF-8
Date: Fri, 02 Nov 2012 14:22:32 GMT
Expires: Sun, 02 Dec 2012 14:22:32 GMT
Cache-Control: public, max-age=2592000
Server: gws
Content-Length: 219
X-XSS-Protection: 1; mode=block
X-Frame-Options: SAMEORIGIN

<HTML><HEAD><meta http-equiv="content-type" content="text/html;charset=utf-8">
<TITLE>301 Moved</TITLE></HEAD><BODY>
<H1>301 Moved</H1>
The document has moved
<A HREF="http://www.googHTTP/1.1 301 HTTP/1.1 301 HTTP/1.1 301 HTTP/1.1 301 HTTP/1.1 301 Moved Permanently
Location: http://www.google.com/
Content-Type: text/html; charset=UTF-8
Date: Fri, 02 Nov 2012 14:22:32 GMT
Expires: Sun, 02 Dec 2012 14:22:32 GMT
Cache-Control: public, max-age=2592000
Server: gws
Content-Length: 219
X-XSS-Protection: 1; mode=block
X-Frame-Options: SAMEORIGIN

<HTML><HEAD><meta http-equiv="content-type" content="text/html;charset=utf-8">
<TITLE>301 Moved</TITLE></HEAD><BODY>
<H1>301 Moved</H1>
The document has moved
<A HREF="http://www.google.com/">hHTTP/1.1 301 HTTP/1.1 301 HTTP/1.1 301 HTTP/1.1 301 HTTP/1.1 301 HTTP/1.1 301 Moved Permanently
Location: http://www.google.com/
Content-Type: text/html; charset=UTF-8
Date: Fri, 02 Nov 2012 14:22:32 GMT
Expires: Sun, 02 Dec 2012 14:22:32 GMT
Cache-Control: public, max-age=2592000
Server: gws
Content-Length: 219
X-XSS-Protection: 1; mode=block
X-Frame-Options: SAMEORIGIN

<HTML><HEAD><meta http-equiv="content-type" content="text/html;charset=utf-8">
<TITLE>301 Moved</TITLE></HEAD><BODY>
<H1>301 Moved</H1>
The document has moved
<A HREF="http://www.gHTTP/1.1 301 Moved Permanently
Location: http://www.google.com/
Content-Type: text/html; charset=UTF-8
Date: Fri, 02 Nov 2012 14:22:32 GMT
Expires: Sun, 02 Dec 2012 14:22:32 GMT
Cache-Control: public, max-age=2592000
Server: gws
Content-Length: 219
X-XSS-Protection: 1; mode=block
X-Frame-Options: SAMEORIGIN

<HTML><HEAD><meta http-equiv="content-type" content="text/html;charset=utf-8">
<TITLE>301 Moved</TITLE></HEAD><BODY>
<H1>301 Moved</H1>
The document has moved
<A HREF="http://www.google.com/">here</A>.
</BODY></HTML>

You are running curl in the background, so each instance runs independently and writes whenever it pleases. Whether its lines land exactly where you expect is entirely coincidental. It can even write over other lines some of the time, though probably not all of the time.

This is not a bug; you simply have no control over the order in which they write.
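A minimal sketch of the ordering problem, with plain background echoes standing in for curl (the interleaving doesn't need the network; file names here are my own, not from the question):

```shell
#!/bin/sh
# Two background writers appending to the same file. On a POSIX-conforming
# system every line survives (each echo is one small append), but the
# relative order of the A and B lines is unpredictable.
rm -f mixed.log
for tag in A B
do
  (
    for n in 1 2 3
    do
      echo "$tag line $n" >> mixed.log
    done
  ) &
done
wait
wc -l < mixed.log   # 6 lines, in some interleaved order
rm -f mixed.log
```

Run it a few times and the A/B ordering changes between runs; with multi-line curl responses the mixing is far messier.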

If you must run them in parallel, each instance of curl should write to its own log file, IMHO. You can merge the files after they have all finished, but not before.

Something like:

for i in {1..20}
do
    curl google.com -is > log-${i}.log &
done

wait

for i in {1..20}
do
    cat log-${i}.log
    rm log-${i}.log
    echo "request # $i"
done > log.log
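A variant of the same idea that keeps the per-request logs in a throwaway directory so they can't clobber any pre-existing log-N.log files. The mktemp call and the stub fetch function (standing in for curl, so the sketch runs offline) are my additions, not part of the answer above:

```shell
#!/bin/sh
# Per-request logs in a temp directory, merged only once every job is done.
fetch() { printf 'response body for request %s\n' "$1"; }  # stand-in for curl

tmpdir=$(mktemp -d)
for i in 1 2 3 4 5
do
  fetch "$i" > "$tmpdir/log-$i" &
done
wait   # make sure every background job has finished before merging

for i in 1 2 3 4 5
do
  cat "$tmpdir/log-$i"
  echo "request # $i"
done > log.log
rm -rf "$tmpdir"
```

Because no two writers ever share a file position, nothing can be overwritten, and grep -c request log.log reliably reports 5 here (20 with the original range).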

I understand that the log will be mixed up since I run curl in the background. But the echoed "request" line should still appear in the log 20 times, right? I tested the script on several other computers and they all show "request" 20 times; it just doesn't work on my computer under Cygwin. If you look at my log.log above, I can live with the curl responses being interleaved, but there is only one "request # 1" line; the remaining echoes are missing. You said it may be coincidental, but I am not sure why it works every time on the other computers.

Thanks for the code; I guess I can use it on my machine to work around the issue.

Not necessarily.

Process 1 writes to the file.

Process 2 writes a fraction of a second later, unaware that the file has been appended to, and overwrites the section the previous process just wrote. Or worse, it overwrites only part of that section, since one write may be longer than the other.

That the code works "a lot of the time" doesn't mean it won't occasionally blow up in your face, in ways you can't predict. In short, don't make the processes compete for the file like that.
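For contrast, here is what a POSIX-conforming >> is supposed to guarantee: the file is opened with O_APPEND, so each individual small write lands atomically at the current end of file and no background echo can clobber another. This is presumably the behavior the asker sees on Ubuntu but not in his Cygwin install (the file name below is mine, chosen for the sketch):

```shell
#!/bin/sh
# 20 background echoes appending to one file. With O_APPEND semantics,
# every write is positioned at end-of-file, so all 20 lines survive.
rm -f demo.log
for i in $(seq 1 20)
do
  echo "request # $i" >> demo.log &
done
wait
grep -c 'request' demo.log   # 20 on a conforming system
rm -f demo.log
```

The multi-line curl responses can still interleave with each other under these semantics, but nothing gets overwritten, which is why the per-file merge above is the robust fix either way.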