How to clean up memory after a command runs?

Hello, I have a while loop in bash, and after a while the script started returning errors like:

wget: error while loading shared libraries: libselinux.so.1: failed to map segment from shared object: Cannot allocate memory
wget: error while loading shared libraries: libcrypto.so.6: failed to map segment from shared object: Cannot allocate memory
./wgetter: fork: Cannot allocate memory

So I want to ask: which command do I need to use to reclaim the memory used by a command in the loop? It looks like this loop gradually fills up memory.

So far I found:
clean up memory on linux | commandlinefu.com

but I'm not sure whether any of those are safe to use, and the second one is a big script, probably not practical inside a loop.

I believe you need to check these .so files and see whether they actually exist:

libselinux.so.1
libcrypto.so.6

Use the locate command.
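
For example (assuming the libraries live in the usual system paths; ldd also shows everything wget links against):

locate libselinux.so.1
locate libcrypto.so.6
ldd "$(command -v wget)"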

It sounds like you are creating child processes that continue to run. If that were not the case, the process you create (from running whatever command you have) would have released its memory when it exited. I am assuming you did not write some sort of C code that creates kernel-persistent objects like semaphores or shared memory and then exits without cleaning up after itself.
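
For example, if the script starts wget in the background with '&', children can accumulate faster than they finish; a wait between passes keeps that bounded. This is only a guess at the structure, since the actual script hasn't been posted:

# hypothetical sketch: only relevant if the real script backgrounds wget with '&'
while true; do
    for url in "$@"; do
        wget -q "$url" &   # each '&' leaves a child running on its own
    done
    wait                   # block until every background wget has exited before looping again
done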

I assume it has to do with this question?

Thanks, I'm just using a simple wget command with its two parameters.

It works, but after some time it seems to fill up memory and then fails with the errors mentioned above...
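
Roughly, the loop looks like this (a simplified sketch; the real URL and the two wget options are different):

while true; do
    wget -q -O /dev/null "http://example.com/somefile"   # placeholder URL and options
    sleep 5                                               # short pause between runs
done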

If you would like useful advice, post the exact script that you're running. Otherwise, you're just wasting everyone's time (including your own).
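
In the meantime, you can at least check whether it really is your loop that's eating the memory. While the script is running, look at:

free -m                          # overall memory usage
ps aux --sort=-rss | head -n 15  # processes sorted by resident memory, largest first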

Regards,
Alister