How to avoid a "Too many arguments" error when passing a long string literal as input to a command?

Hi,

I am using awk here.
Inside an awk script, I have a variable that contains very long XML data as a string (about 500 KB).

I want to pass this data (as an argument) to the curl command using the system() function,
but I get a "Too many arguments" error because of the length of the string data (payloadBlock).

I have tried the ways below, but both error out with code 127 (Too many arguments).
The awk-extracted value inside the variable (payloadBlock) looks like:
payloadBlock="<?xml?><root> ... ... (lot of xml data) ... ... </root>";

CURL_RETURN_CODE=system("curl -s -S -X POST "URL" -H \"Content-Type: text/xml\" --data-binary '"payloadBlock"'");
CURL_RETURN_CODE=system("echo '"payloadBlock"' | curl -s -S -X POST "URL" -H \"Content-Type: text/xml\" --data-binary @-;");

Could you advise a better way to do this?
NOTE: I have to invoke curl from within awk, as I am computing this string data inside the awk script.

What language is this? The shell has no system() command, nor any need for one.

To post a lot of data,

echo "all my post data" | curl -d '@-'

I am using an awk script, and passing the script to the awk command with the -f switch.
awk has a built-in function called system() to execute shell commands.
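For illustration, system() hands its string to a shell and returns the command's exit status; this is a minimal standalone example, not the actual script:

awk 'BEGIN { rc = system("echo hello"); print "exit status:", rc }'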

In your answer, the string "all my post data" is short.
But if it were a really long string, the command would fail with the error: Too many arguments.

The way to avoid "too many arguments" is to not use arguments. Use a pipe.

Do this outside awk and pipe its output into curl. @- tells curl to read the data from standard input instead of from an argument (a fuller version of the command is sketched below).

awk -f program | curl --data-binary "@-" ...
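Spelled out with the same options used in the attempts above (a sketch; $URL is assumed to be a shell variable holding the endpoint):

awk -f program | curl -s -S -X POST "$URL" -H "Content-Type: text/xml" --data-binary @-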

This would work if I wanted to transmit all of awk's output in one HTTP POST.
But if I want to POST each record extracted by awk separately, I have to run curl from inside the awk script.
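One way to follow the same pipe idea while staying inside awk is awk's own output pipe: print each payload into a curl command and close() it per record, so the XML travels through the pipe rather than on the command line. A minimal sketch, assuming URL is passed in (e.g. with -v URL=...) and payloadBlock already holds the XML for the current record; the value returned by close() differs between awk implementations:

{
    curlCmd = "curl -s -S -X POST " URL " -H \"Content-Type: text/xml\" --data-binary @-"
    printf("%s", payloadBlock) | curlCmd     # the payload goes through the pipe, not the command line
    CURL_RETURN_CODE = close(curlCmd)        # close() waits for this record's curl; its return value is implementation-dependent
}

Because close() is called on every record, each record gets its own curl process and therefore its own POST.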