Ignoring other considerations for a moment, and in general ...
Would there be a difference in result (in the compiled .o, or in execution) between:
A.
strncpy( a, b, sizeof(a) );
vs.
B.
c = sizeof(a);
strncpy( a, b, c );
My general understanding (at least I think it is my understanding) is that when something is inherently fixed in size, the compiler inserts the literal value of the size of the item at compile time, as opposed to leaving an implicit run-time computation.
If my understanding is correct, then it seems in scenario A we'd have strncpy( a, b, 4 ) in a single statement, compared to two statements in B: c = 4; strncpy( a, b, c );
Is that not correct?
Am I missing something?
Thanks in advance for any comments.
Geo. Salisbury
Long Valley, NJ
What I was after is what, if any, would be the virtue of using:
c = sizeof(a); strncpy( a, b, c );
vs.
strncpy( a, b, sizeof(a) );
I'm not a C maven (I can kludge along given good consult), but it seems to me that storing the sizeof value in a separate int adds overhead while not altering the execution result.
I appreciate that we're talking fractions of a nanosecond and negligible amounts, but I am interested in clarifying the precept.
Storing the result in an int is wrong, but won't actually hurt you unless a contains more bytes than fit in an object of type signed int. If you're using an external object to store the result of the sizeof operator, its type should be size_t, not int.
Note that if you are in a subroutine that has been passed a pointer to an array of characters to be copied into an area of memory pointed to by another pointer to an array of characters, you must also be given the size of the destination array, as in:
I have to bail on this for now - will pick it back up tomorrow after considering your comments.
Thx.
Geo.
---------- Post updated 12-15-15 at 05:19 PM ---------- Previous update was 12-14-15 at 05:50 PM ----------
Finally able to return ...
The ability of 'c' to hold the sizeof(a) would not be a concern, as 'a' is defined as an eight-character, fixed-length string, for example char a[8].
All of this is happening in a closed domain of a running application. The present method in place is the one-step strncpy(a, b, sizeof(a)). The two-step approach came up as supposedly "better", but it struck me as accomplishing nothing different while "costing" the teensy bit of storage for the extra int.
We'll leave the strncpy... as is with the sizeof... as one of the arguments.
Thanks for your thoughts.
Geo.
You seriously don't need to worry about overhead of an extra line of code.
Let's do some rough guess calculation here...
Say you've got a 2 GHz processor: that's 2,000,000,000 cycles per second.
Let's say that extra line adds 20 cycles to the program. That's one hundred-millionth of a second (10 nanoseconds).
Reading a few bytes from a disk may well take in the order of milliseconds.
If you think it makes the program clearer, then do it. I routinely add extra steps and variables to make my code more readable, and I deal with vast amounts of 24-hour streaming data.
And my stuff flies.
FWIW,
The original query was not overtly based on concerns of overhead or performance per se but, rather, on possible "technical" differences in the compiled result.
The illustrated fragments using sizeof as an argument were in place in an area that was already selective in execution, and time and volume did not come into play.
A discussion had started on the merits of an argument vs. a variable and the posting here was more for enlarging the set of opinions. I, too, will often include additional steps etc. in order to make a sequence more clear or to be able to include some step-wise commentary.
The upshot was that we (I) left the use of sizeof as an argument in place, principally because we (I<g>) didn't have to do anything (Management 101, first precept: do nothing), and because the use was in a block that was already well commented, leaving no ambiguity.
Thanks - that's pretty much what the takeaway has been.
In the code blocks here that gave rise to the conversation the sizeof was expressed in the using statements and not set as separate variables.
It seems that, for all practical purposes, the compiler treats sizeof as a placeholder for substituting the literal value of the size of the referenced item. That's good enough and is as far as we need take it.