Writing a file to RAM within Bash and using it

I've got a few scripts I use for various things, but one of them takes the output from a specific command, writes it to disk, does the same for another command, then reads both files back (after some formatting) and writes the result to a third file.

Part of the reason for writing to the two files before catting the output into the single file is to keep things organized, since the two commands run side-by-side (with &, of course).

What I'd like to know is whether there is a good way to keep the output from the two commands in RAM before catting/redirecting it into the final file.

Anyone know of any good ways of doing this?

Only as a file held on a RAM disk.

Alternatively, while the data is in a pipe it is in RAM.
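For example, bash's process substitution keeps the intermediate data in kernel pipe buffers rather than on disk. A sketch (the command names are placeholders):

cat <(slow_command_1) <(slow_command_2) > combined.txt

Both commands start at once; cat drains the first pipe before the second, so the second command can only run ahead by about one pipe buffer (~64 KB on Linux) before its writes block, but nothing is written to disk except the final file.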

Sounds like what you actually want is a temporary file that cleans itself up automatically.

If you want to deal with memory etc, write a C program. Shell scripts deal at the process/program/file level.

I was afraid of that. Currently each command's output is directed to its own file, and the two are then catted together. I was hoping to be able to do this in RAM, since the two commands themselves are fairly I/O-heavy.

Do you (or anyone else) have any idea of a good way to hold the data in a pipe or something similar? I'm really hoping to avoid doing this all in C (which I would have to learn first, anyway. Guess it's about time I learned...), so I'd like to know whether I'm asking for the impossible.

Then why not have the commands write to stdout, and just run one command after the other?

Are you referring to something like

program > "$somefile"
$format_cmd >> "$somefile"
program2 >> "$somefile"

or something else? The reason I have these programs running side-by-side is the run time (well, actually, it's only ~20+ seconds when they run one after the other, but running them in parallel as I do now gets that down to ~14 seconds), and I'd like to keep that saving. Plus, this would be something nice to know for later on.

What about storing the output in a variable, then echoing/catting that into the final file? Granted, the output is a bit lengthy and that could cause some problems, but would it be possible?

Hey, I'm doing similarly heavy, necessary I/O operations.

The thing with putting it into a variable is that you lose the newlines when you expand it unquoted. That's the only problem I can see; it's a bit messy, too. You could substitute the newlines with some character that never appears in the output and translate them back later.

i.e. filevarsub=$(echo "$somefiledata" | tr "\n" "\a")    # using the bell character \a as the stand-in

then translate it back when writing to the final output file

echo "$filevarsub" | tr "\a" "\n"

simple, but inelegant.
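Put together, the round trip looks something like this (a sketch; program and finalfile are just placeholder names, and \a stands in for whatever unused character you pick):

somefiledata=$(program)                              # the capture itself keeps the newlines (bar trailing ones)
filevarsub=$(echo "$somefiledata" | tr "\n" "\a")    # swap newlines for the stand-in character
echo "$filevarsub" | tr "\a" "\n" > finalfile        # swap them back on the way out

That said, as long as the variable is always expanded inside double quotes (as above), the newlines survive on their own and the tr round trip isn't strictly needed.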

Secondly, you could mount a ramdisk:

mkdir ram                        # in your main scripts directory
sudo mount -t ramfs ramfs ram    # ramfs takes no useful mount options, so things like size or mode won't work
sudo chown "$USER" ram           # $USER is the user you're logged in as

You could stick it in /etc/fstab to automount at boot just for your scripts,
or you could set things up so that mounting doesn't require root permissions (be careful with that one), so you can mount it from within the scripts.
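If you do go the fstab route, tmpfs is worth a look instead of ramfs: it accepts size and ownership options, and the user option lets an ordinary user mount it, so the script can do it without sudo. A sketch (the path, size, and uid/gid values are just examples):

# /etc/fstab: a 64 MB tmpfs the scripts can mount and unmount themselves
tmpfs  /home/you/scripts/ram  tmpfs  size=64m,mode=0700,uid=1000,gid=1000,noauto,user  0  0

# then, from inside the script, no root needed:
mount /home/you/scripts/ram     # do the I/O-heavy work in there...
umount /home/you/scripts/ram    # ...and it all disappears when unmounted

Drop noauto if you'd rather have it mounted at boot instead.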

I'd love for there to be a better way; if anyone knows one, please do tell.

Something else....

run_all()
{
     program
     $format_cmd
     program2
}

run_all >output

would concatenate the outputs into "output" without temporary files.
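If the two programs still need to run side-by-side, the same grouping idea works with background jobs and wait, though their outputs can then interleave in the file (a sketch):

run_both()
{
     program &         # both start immediately...
     program2 &
     wait              # ...and wait blocks until both have finished
}

run_both > output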

use mkfifo.

That's a pipe that lives in RAM; just restructure your code to read from and write to it,

e.g.

mkfifo pipename
cat "$originalfile" | someprocessingstuff > pipename & grep "extractsomedata" pipename

The & backgrounds the writer so the reader can run at the same time; a FIFO needs both ends open at once, otherwise each side just blocks.
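To map that onto the original two-command case, something along these lines should work (a sketch; out1, out2 and output are just example names):

mkfifo out1 out2

program  > out1 &      # each writer blocks in open() until its FIFO gets a reader
program2 > out2 &

exec 3<out1 4<out2     # open both read ends straight away so both programs can start

cat <&3  > output      # drain the first command's output...
cat <&4 >> output      # ...then append the second's

exec 3<&- 4<&-         # close the read ends
rm out1 out2

The second program can only get about one pipe buffer (~64 KB) ahead before its writes block, but everything stays in RAM and both commands run at the same time.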