No child process is allowed to change the environment of its parent process. So if your awkscript had set a VNAME variable explicitly, this would be valid in its own scope, but would NOT be set in the calling script.
To work around this, the calling script has to explicitly accept the value into its own scope using one of its own built-in commands, and Sam’s declaration is exactly right to do this.
This is good for returning one value, but there are also methods for a sub-script to return multiple values in various forms.
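As a concrete sketch of the single-value case (using a stand-in function for awkscript, since the original script isn’t shown), the parent captures the child’s stdout with command substitution:

```shell
# "awkscript" here is a hypothetical stand-in for the real sub-script.
awkscript() { awk 'BEGIN { print "val1" }'; }

# The child cannot export into our environment, so we accept its
# output into our own scope with command substitution:
VNAME=$( awkscript )
echo "$VNAME"
```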
Suppose awkscript needs to return 4 words on one line: val1 val2 val3 val4.
In ksh, you could accept all 4 values into separate variables using:
awkscript | read V1 V2 V3 V4
If the 4 values could contain spaces, you would print them on separate lines, and accept them using:
awkscript | { read V1; read V2; read V3; read V4; }
Another technique is to have the sub-script return a complete ksh command. For example, if awkscript printed one line of text containing:
export V1='val1' V2='val2' V3='val3' V4='val4'
then the outer shell can accept all the values using:
eval "$( awkscript )"
Bash needs a different syntax (using Here Strings) to do the reads shown above. Ironically, this is because in ksh pipelines, the “read” built-in runs in the calling shell itself, whereas in bash it runs in a sub-process and therefore the variables it sets are in the wrong scope. The equivalent in bash is:
read V1 V2 V3 V4 <<< "$( awkscript )"
There are other cases of this environment issue in ksh. For example, I had a cd that echoed a pathname (because it used CDPATH), and I was investigating whether the path had any non-printing characters. So I wrote:
cd "${DIR_USER}" | cat -vet
This stops the cd from taking effect! The cd now runs in a subshell because of the pipeline, so the original shell stays in its current directory. The line below works, however, because redirecting into a process substitution does not create a pipeline:
cd "${DIR_USER}" > >( cat -vet )
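The difference can be demonstrated in bash too (using /tmp as an illustrative directory): the piped cd is lost, while the redirected one sticks.

```shell
start_dir=$PWD

# The cd on the left of a pipeline runs in a subshell and is lost.
cd /tmp | cat
after_pipe=$PWD

# Redirecting to a process substitution is not a pipeline,
# so this cd changes the current shell's directory.
cd /tmp > >( cat -vet )
after_redirect=$PWD

echo "$start_dir -> $after_pipe -> $after_redirect"
```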