Description
Note: this is a Bash script, which requires Linux. The Windows Subsystem for Linux (WSL) may work if you don't have a Linux server or VM running somewhere.
Splits a .json export file from the Spiceworks legacy app or Cloud Help Desk into smaller files. This should help with failing imports into the Cloud Help Desk when you have more than 2000 tickets.
This script relies on jq (tested on v1.6) being installed: https://jqlang.github.io/jq/
Run this in an empty folder, as any numerically named files will be renamed at the end of the script.
Usage: ./json_ticket_split filename.json
*When importing, deselect "Keep the same ticket number shown importing", as this will not work with multiple import files (your first import would be overwritten by the second, and so on).
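Before splitting, it can help to confirm how many tickets the export actually contains, using the same `jq '.tickets | length'` query the script itself uses. This is a minimal sketch on toy data; `toy_export.json` is a stand-in for your real Spiceworks export file:

```shell
# Create a tiny stand-in export (your real file would be e.g. exported_data.json).
printf '{"tickets":[{"id":1},{"id":2},{"id":3}]}' > toy_export.json

# Count the tickets in the export; if this is over 2000, splitting is worthwhile.
count=$(jq '.tickets | length' toy_export.json)
echo "Export contains $count tickets"   # Export contains 3 tickets
```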
Source Code
#!/bin/bash
# Splits a .json output file from a Spiceworks legacy app or cloud help desk into smaller files
# This file relies on jq (tested on v1.6) being installed https://stedolan.github.io/jq/
# Run this in an empty folder as any numerically named files will be renamed at the end of the script
#
# Usage ./json_ticket_split filename.json
deletetickets() {
# Delete the slice of tickets between $tickstart and $tickend from $sourcefile
jq --arg tickstart "$tickstart" --arg tickend "$tickend" 'del(.tickets[($tickstart|tonumber):($tickend|tonumber)])' "$sourcefile" > "$filenum.tmp"
mv "$filenum.tmp" "$filenum"
}
# Bail out early if no (or a nonexistent) input file was given
if [ -z "$1" ] || [ ! -f "$1" ]; then
    echo "Usage: ./json_ticket_split filename.json" >&2
    exit 1
fi
echo "Splitting Spiceworks .JSON file $1"
#Alter the value below if you're splitting up by a different ticket quantity per file
ticketsperfile=2000
filenum=1
sourcefile=$1
tickstart=$ticketsperfile
tickend="$(jq '.tickets | length' "$1")"
total=$tickend
echo "Working on file $filenum"
deletetickets
((filenum++))
loop=0
ticknext=0
until (( ticknext > total ))
do
echo "Working on file $filenum"
sourcefile=$1
tickstart=0
loopmultiply=$(( loop * ticketsperfile ))
tickend=$(( ticketsperfile + loopmultiply ))
deletetickets
ticknext=$(( tickend + ticketsperfile ))
tickstart=$ticketsperfile
tickend="$(jq '.tickets | length' "$filenum")"
sourcefile=$filenum
deletetickets
((loop++))
((filenum++))
done
#renaming files
ls *[0-9] | sort -n | cat -n | while read n f; do mv -n "$f" "split_export_$n.json"; done
echo "File splitting complete!"
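The splitting technique above boils down to jq's `del()` on array slices: each output file is produced by deleting everything outside one "page" of tickets. A minimal sketch on toy data (the 2-tickets-per-file size and the `part*.json` names are illustrative only, not the script's real output names):

```shell
# Toy export with 5 tickets
printf '{"tickets":[{"id":1},{"id":2},{"id":3},{"id":4},{"id":5}]}' > sample.json

# File 1: drop everything after the first 2 tickets
jq 'del(.tickets[2:])' sample.json > part1.json
# File 2: drop the first 2, then drop everything after the next 2
jq 'del(.tickets[0:2]) | del(.tickets[2:])' sample.json > part2.json
# File 3: drop the first 4, leaving the remainder
jq 'del(.tickets[0:4])' sample.json > part3.json

jq '.tickets | length' part1.json   # 2
jq '.tickets | length' part2.json   # 2
jq '.tickets | length' part3.json   # 1
```

Note the second file applies two deletions, which is why the script calls `deletetickets` twice per loop iteration: once to trim the front of the source file and once to trim the tail of the intermediate file.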
Never mind. Replied to the wrong topic. Here are more words.
mwit2 (MyWood):
Hello - Windows user here, who is not very familiar with BASH scripts. I was able to install Linux on a test machine, and have tried running this script to no avail. Can you please elaborate how to run this script, and how to direct it toward the json file so it knows what to split? Thank you so much!
We have one thing which we find a bit tricky: there is no ability to sort or filter by custom attributes in the Tickets section, and in the Reports section custom attributes cannot be included. Is it possible to add sorting/filtering for custom attributes? This would be massively powerful for us and, I assume, other users too.
Hello Windows users, you can also use Git Bash to run this script if you have it, but first you will have to install Chocolatey and then jq, for which I followed these steps.
How do I split one of the output files into further splits? File number 1 of 8 (and only file number 1) fails to import, so I'd like to break it into smaller chunks. However, if I change "ticketsperfile" to 200, it all falls apart, whether I run it against file number 1 or even the master exported_data.json. How do I break the output into additional chunks, or otherwise further split the JSONs, in order to skip whichever ticket is tripping it up? ELI20?
If the already-split file fails, it's likely a different issue than size, so further splitting wouldn't help. Your best option is to try to figure out the problematic ticket(s) and either modify or remove them prior to export.
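To hunt for a problematic ticket, jq can pull out a slice for inspection or list tickets with a suspicious field. This is a hedged sketch on toy data: the file name and the `summary` field are illustrative, so check the actual field names in your own export:

```shell
# Toy stand-in for a failing split file; ticket 11 has a null field.
printf '{"tickets":[{"id":10,"summary":"ok"},{"id":11,"summary":null}]}' > split_export_1.json

# Eyeball a slice of tickets (here: the first two):
jq '.tickets[0:2]' split_export_1.json

# List the ids of tickets whose "summary" is null, which might break an import:
jq '[.tickets[] | select(.summary == null) | .id]' split_export_1.json
```

Narrowing the slice bounds in `.tickets[0:2]` step by step (a manual bisection) is one way to find which ticket the importer chokes on.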