Why is editing a file by renaming the new one safer?

Hello:
I've been reading about ways to edit files from the command line, and I've found two websites which state that the following is the safest way to edit a file:

command original > new
mv new original

That is, renaming the new file over the original one. This is what the websites I mentioned state:

(Source: BashFAQ/021 - Greg's Wiki)

(Source: “In-place” editing of files « \1)

What I don't understand is: what makes it safer? Why is it safer if the machine stops unexpectedly while the file is being renamed? What's the logic behind this?

Thanks in advance.

I don't understand what point they're making either.

All you need to be is "professional". That is, before you edit a file, copy it (backup).

That way if you make a lot of (bad) changes and you want to see what it was like before you started, no problem.

Also, never delete any file that you're not sure about. Yes, this is where mv comes in (as they suggest).
I think that I need to delete this configuration file..............

mv existing existing.old

Whoops......shouldn't have deleted that? No problem, mv it back.
NEVER delete anything you're not sure about.

If the system crashes whilst you're editing a file, no problem, you've got an original copy.

Just be professional and you won't go wrong.
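
A minimal sketch of both habits (the file names here are just placeholders):

cp -p important.conf important.conf.backup   # back up before you edit; -p keeps mode, owner, timestamps
vi important.conf
mv obsolete.conf obsolete.conf.old           # "delete" by renaming, so you can always mv it back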

Hi,
If I have correctly understood the described method, then on a file system without copy-on-write or journaling on disk, it seems illogical to me.

When working on complex programming, I take backups even more seriously: I sequentially number versions to help me go back and find prior logic.
cp GreatScript GreatScript.v001
then continue editing GreatScript until I get to another good spot and
cp GreatScript GreatScript.v002
and continue with this process.
Eventually, I often create an Archive folder, and then mv GreatScript.v* ../Archive

Sometimes, it is only apparent much later on that there was a major mistake in the coding and a need to go back further than only one version. Plus, while developing, I always feel better knowing I have more than one backup copy.
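
A small helper in the same spirit, assuming bash (the function name and the zero-padded numbering are my own choices, not from the post):

# Save the next numbered snapshot of a file, e.g. "snapshot GreatScript"
# creates GreatScript.v001, then GreatScript.v002, and so on.
snapshot() {
    local file=$1 n=1
    while [ -e "$file.v$(printf '%03d' "$n")" ]; do
        n=$((n + 1))
    done
    cp -p "$file" "$file.v$(printf '%03d' "$n")"
}

When it is time to archive, mv GreatScript.v* ../Archive still works exactly as described above.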


They mean: copying a file, especially a big file, is done in steps. If there is a power loss in between, the state of the file is unknown: it can be empty or partially copied.
In contrast, renaming/moving a file is "atomic": after a power loss it's either 100% the new file or 100% the old file.
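
A minimal sketch of that pattern, assuming a sed substitution as the edit and mktemp for the scratch file (both are my assumptions, not part of the posts above):

# Write the edited version to a temporary file on the SAME filesystem,
# then rename it over the original; rename() is atomic, so after a power
# loss the file is either entirely old or entirely new, never half-written.
tmp=$(mktemp original.XXXXXX) || exit 1
sed 's/foo/bar/g' original > "$tmp" && mv "$tmp" original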

But with a copy speed of >100 Mbyte per second and journaling, it is a bit paranoid to assume a power loss within those few microseconds, IMHO.
So that's why I prefer to keep the inode intact (including attributes and link count), and simply assume that the power will last for the next few microseconds.
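
A quick way to see that difference (ls -i prints the inode number; a sketch with placeholder names):

ls -i original                                        # note the inode number
sed 's/foo/bar/g' original > new && mv new original
ls -i original                                        # different inode: a new file took its place

sed 's/foo/bar/g' original > tmp && cat tmp > original && rm tmp
ls -i original                                        # same inode, attributes and link count kept
                                                      # (but this overwrite is NOT atomic)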


Practically speaking,

It depends on your risk management model.

If your system is prone to crashing or locking up, then it might be a better idea to copy the file to another server, do the edits there, then upload it to the server and move it into place.

However, it sounds fishy if your server is so unstable that it is prone to crashing or has such resource problems.

Normally, and I mean every day on remote, production servers, I copy the file I want to edit and add a ".backup" or ".neo" extension to it, or something like that. But I generally edit the original file and save it to disk when I'm done.

When editing, you are editing a copy in memory, not the copy on disk; so if the system crashes while you are editing, you only lose the changes in the editor, not the file on disk.

I guess one could say that when you cross the street, you should look right, then left, then up, and then down, and, to be safe, look behind you too. However, most of us look right and left. If you want to edit copies and move them, that's cool, but it is not going to change much in your life compared to editing the original and saving it.

What is important, as mentioned by others and also by me again here, is to make a quick backup copy of a file before you edit it. I do this most of the time, even when I have offsite backups.

Making a copy, editing the copy, and moving it to replace the original file is still "not perfect" because you have still written over your original. You should at least make a copy, edit the original, and save it, knowing you have a fresh backup. If you copy the original, edit the copy, and move it to overwrite the original, where is your fresh backup? You don't have one (in this scenario). Ditto if you copy the file you just edited over the original, you then have two potentially "fat fingered" copies.
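
To make the two workflows concrete (a sketch, with made-up file names):

# Workflow A: copy first, then edit the original.
# You keep an untouched backup next to the edited original.
cp -p app.conf app.conf.backup
vi app.conf

# Workflow B: copy, edit the copy, mv it over the original.
# Once the mv happens the original is gone; the only copy left is the edited one.
cp -p app.conf app.conf.new
vi app.conf.new
mv app.conf.new app.conf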

So, what's the point? What is the risk? What is the system vulnerability you are trying to mitigate?


I am largely with Neo - depending on what file you are modifying, you might not even need a backup copy at all. For example, I often create feeding files for loops and afterwards modify them (add or remove things); these are my very own files and I can usually recreate them very easily if I ever have to, so these I modify without any backups. System files, however, should always be modified after making a copy, ideally a copy where ownership and permissions are the same as the original, so that if anything ever goes wrong, all you have to do is rename the original file to something like .old and your copy to the original filename.
BTW - in 32 years, I have not lost a single file to a system crash.
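
A sketch of that routine for a system file (the file name is made up; run as root so ownership and permissions carry over with cp -p):

cp -p /etc/example.conf /etc/example.conf.copy   # backup keeps mode, owner, timestamps
vi /etc/example.conf                             # edit the original
# If the edit goes wrong, fall back to the copy:
mv /etc/example.conf /etc/example.conf.old
mv /etc/example.conf.copy /etc/example.conf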

I agree.

For many non-critical files, where I am making small, incremental changes, I often do not make a fresh backup copy, especially because I have off-platform backups as well; I just edit the file directly and save it, as normal.

Like zxmaus, I cannot recall ever losing a file due to a system crash while editing, in over 40 years of working with computers.

However, I do recall making a lot of "simple human mistakes" and have often been "saved by backups". This leads me to always recommend that people make and maintain filesystem backups, based on their risk management model (criticality, vulnerability, threats).

These days, more often than not, for an increasing majority of my file edits, if they are significant, I will sftp the file to my desktop, open it in Visual Studio Code (or cut-and-paste it into VSC if it is a small file), edit it using all the available syntax and formatting tools and plugins, and save the edited file under a different name, preserving the original file in my working directory on my desktop. Then I will either sftp the result back or cut-and-paste it into the remote server over an ssh terminal.

I cannot count the number of times VSC has been helpful in spotting a syntax error which my tired, overworked eyes missed. The formatting features (indentation, consistent formatting, etc.) are also very useful in VSC. These kinds of tools are real time savers, especially for syntax checking.

It goes without saying that I use vi every day to edit files; but I also use vi in conjunction with VSC, more and more, for the syntax checking and formatting of code (programming languages), JSON files, etc. And, as I am quick to confess, I do sometimes edit files with vi without making a backup copy; not often, but if it is some small change which I can easily revert based on "memory", then I am guilty. I also push files to private Git repositories when my work on critical files is done. Git is good for backups :slight_smile:
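
For the Git part, a minimal sketch (the directory, file name, remote URL, and branch name are all placeholders):

cd /path/to/scripts
git init                                              # once per directory
git remote add origin git@example.com:me/scripts.git
git add critical.sh
git commit -m "working version before today's changes"
git push -u origin master                             # or main, depending on your config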

I agree also,

I did lose files after crashes, but that was in the early 90s on HP-UX 8, or Linux 0.99... I can't remember losing any afterwards, with JFS...

As Neo mentions, the vulnerability is not the OS crash so much as the human intervention on the files being modified. For example, I saw Sun servers after a reboot where no one could connect, because someone had modified the passwd file NOT using vi... and many more similar cases. So for peace of mind I always make a copy, which I modify, and once done I add .ori to the original so that whatever happens I can compare. My remarks apply mostly to Unix configuration files, except sudoers, which needs editing with vi (as, depending on the state of the machine, it is the only editor working...).

It is true that modern editors, as Neo described, have funky functionality that is more than simple cosmetics, and it would be a pity not to use it; but, as he also adds: with caution...

What I wanted to mention was more this: in panic mode you tend to forget a lot of things. When a system crashes, you may well have a mail (but who looks at his mbox?) from the system saying you were editing a file and that its state was saved under the name XXXXXXX. You open it with vi -r and, depending on the content and on how bad the crash was, you may find that the system managed to leave the original in its previous state (before editing, or as of the last save) and to keep in XXXXXX the latest state of your current modifications to the file. It may not be complete, but at least you haven't lost everything, and most importantly it has spared your system some unwanted file corruption... this may be the reason why, with JFS, no one can remember losing anything in the last 25 years...
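
The recovery step itself looks roughly like this (the script name is just an example):

vi -r                          # list the files vi/vim knows it can recover
vi -r script.sh                # recover the unsaved edits of script.sh
# Inside vi, write the recovered text to a new name and compare it with
# what is on disk before overwriting anything:
#   :w script.sh.recovered
#   :!diff script.sh script.sh.recovered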

But that's no reason not to be careful when doing sysadmin tasks.