I have a script that loops with a foreach and exports to a .CSV file on each iteration.
So it appends to the .CSV file many times.
No problem with this:
| Export-Csv C:\FolderX\FileX.csv -NoTypeInformation -Append
Now I would like to put a Time Stamp in the file name.
So I have this:
$TimeStamp = get-date -f __\Da\te_yyyy-MM-dd__Ti\me_HH_mm_ss…fff___tt
It looks ok like this: FileX___Date_2023-01-21__Time_10_32_24…884___AM
The problem is that every time it exports, it creates a new file.
I tried to fix it by adding this:
$TimeStampx = "c:\homeuse\FileX_$TimeStamp.csv"
And then using this:
| Export-Csv $TimeStampx -NoTypeInformation -Append
But I get a different file written for each loop.
So how can I get just one timestamped file? Thank you in advance for your help.
I just figured it out.
I needed to put this part before the loop, not before the export.
$TimeStamp = get-date -f _\Da\te_yyyy-MM-dd__Ti\me_HH_mm_ss…fff___tt
$TimeStampx = "c:\homeuse\FileX$TimeStamp.csv"
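Here is a runnable sketch of that pattern (the temp path, format string, and sample data are placeholders, not the original script): the timestamp is computed once before the loop, so every `Export-Csv -Append` inside the loop hits the same file.

```powershell
# Compute the timestamp ONCE, before the loop, so every Export-Csv call
# inside the loop appends to the same file.
$TimeStamp  = Get-Date -Format 'yyyy-MM-dd_HH_mm_ss'
$TimeStampx = Join-Path ([IO.Path]::GetTempPath()) "FileX_$TimeStamp.csv"

# Placeholder data standing in for whatever the real loop iterates over.
$items = 1..3 | ForEach-Object { [PSCustomObject]@{ Id = $_; Name = "Item$_" } }

foreach ($item in $items) {
    # ...do the real per-item work here...
    $item | Export-Csv -Path $TimeStampx -NoTypeInformation -Append
}
```

All three rows land in one file, because `$TimeStamp` never changes during the loop.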
Silly me. I am an old programmer. I should have known better.
It works now.
Thank you for reading this.
You’re correct. You need to set the variable outside of the loop, otherwise it will always change based on the current time.
I start my scripts like this:
$StartTime = Get-Date
That way I can later use the DateTime object as a comparison and then use it to create/update filenames like so
1..10 | ForEach-Object {
    Write-Information -MessageData "Drive:\Path\Filename-$($StartTime.ToString("yyyyMMdd")).txt" -InformationAction Continue
}
Since we’re not privy to the rest of the script, it’s not ideal to output the CSV one line at a time, especially since Export-Csv can be called once on a whole collection of objects. In any case, glad you got it sorted out.
SaidBrandon, thank you for your reply.
I am intrigued. If I understand you correctly, Export-Csv can be called only once instead of hundreds of times? Sometimes I end up with a collection of hundreds of objects that end up in an Excel .CSV file.
Thank you for the code you supplied. But I don’t understand it yet. I will have to search for an example.
Thank you very much.
Yes. Export-Csv will handle your array of objects. Again, since I’m not sure of your dataset,
I’ll give you an example of the recommended (i.e. best) approach:
Get-ADComputer -Filter * | Export-Csv -Path "C:\Path\File.csv"
instead of
Get-ADComputer -Filter * | ForEach-Object {
    $_ | Export-Csv -Path "C:\Path\File.csv" -Append
}
gary-m-g (Gary M G)
You can collect the data you need within your foreach loop by emitting a PSCustomObject on each iteration and assigning the loop’s output to a variable.
Sort of like:
$myData = foreach ($thing in $things) {
    # ...do stuff...
    [PSCustomObject]@{
        VALUE1 = $thing.Attribute1
        VALUE2 = $thing.Attribute2
        Value3 = $someCalculatedValue
    }
}
Then, you export once, after the loop finishes:
$myData | Export-Csv -NoTypeInformation MyFile.csv
jimlong3 (Jim6795)
File access is horrifically slow, in any case. Mostly, it seems to me that people use .CSV files as a crude “inter-process communication” medium. There are vastly better ways to share data.
But if you must, then the winning magick is to read or write the whole file, one time only.
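To make that concrete, here is a rough sketch (file name and row count invented) of building the output in memory and writing it in one go, versus the per-iteration append pattern:

```powershell
# Build everything in memory first, then write the file a single time:
# one file-open, one write.
$out   = Join-Path ([IO.Path]::GetTempPath()) 'once.txt'
$lines = foreach ($i in 1..1000) { "row $i" }
Set-Content -Path $out -Value $lines

# versus the slow pattern: one file-open per iteration
# foreach ($i in 1..1000) { Add-Content -Path $out -Value "row $i" }
```

Same output either way; the single write just avoids a thousand open/close cycles on the file.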
SaidBrandon, thank you very much for that example. I will study it and try it.
Gary M G, and thank you very much for that example; I will study it and try it too.
Jim6795, I have not experimented with any other kind of output because that’s how Don Jones showed how to do it in his “…Month of Lunches” book. Ha! And I often use these exported .CSV files as good import files for other scripts. I may try to use .txt files some day. I also Write-Output to the console because it is quick. What else do you recommend? Thank you.
jimlong3 (Jim6795)
That depends on where the data are going. For anything related to Active Directory or WMI, that’s the Source of Truth anyway, so I just use it. For “further processing”, I just pipe the PSCustomObject from one script into another…
For ordinary lookups, I kill puppies with Write-Host, but I confess to storing anything I want to keep in .CSVs for convenience. It’s useful for keeping the Structure of the data as well as the data itself. XML and JSON look interesting, but compatibility is more important to me.
Not much of an answer, I apologize…
CSVs have their place since they can easily be opened in Excel by double-clicking. The disadvantage is that they aren’t structured in any way: just rows of data put into columns, i.e. Comma-Separated Values. I only use them when my task requires I share with others, but I typically open in Excel, format as table, and save as xlsx. As far as raw data goes, everything is stored as JSON. It allows much more flexibility when you have structured/hierarchical data. I also prefer it to XML.
As a few examples of structured data:
All your ADUsers and the groups they belong to
All your WSUS clients and their missing updates
All your Virtual Machines and their associated NICs, tags, and so on
Can a CSV do it? Sure, but do you want to? I don’t.
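For example, a nested record like one of the above survives a JSON round-trip intact, where a CSV would force you to flatten the list into a string (the user and group names here are invented):

```powershell
# A user and the groups they belong to: one object with a nested array.
$user = [PSCustomObject]@{
    Name   = 'jdoe'
    Groups = @('Admins', 'Developers')
}

# Serialize to JSON and back; the hierarchy is preserved.
$json = $user | ConvertTo-Json -Depth 3
$back = $json | ConvertFrom-Json

$back.Groups   # still an array of two strings, not a joined cell value
```

Exporting the same object with Export-Csv would render `Groups` as a single flattened column, which is exactly the limitation being described.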
One final point. PowerShell is about objects, and JSON stands for “JavaScript Object Notation” while CSV is just Comma-Separated Values.