Downloading the pruned file from NKN (https://nkn.org/ChainDB_pruned_latest.tar.gz) was fast for me the first time, but that was right before the huge influx, which consequently made the download take as long as 2-3 days on any node.
Backing up ChainDB
Correct me if I am wrong, but I believe you can just copy the ChainDB folder to any other node and it will pick up from there, sometimes pruning before it starts. I ended up downloading once from the NKN network, syncing it on a single node, then backing it up and distributing it to other nodes, like my home server.
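If you would rather copy directly between nodes, rsync over SSH works too. A minimal sketch, assuming both nodes use the default install path, you have SSH access, and the node is stopped on both ends (user@other-node is a placeholder):
rsync -az --delete /home/nkn/nkn-commercial/services/nkn-node/ChainDB/ user@other-node:/home/nkn/nkn-commercial/services/nkn-node/ChainDB/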
Compress on the same machine:
cd /home/nkn/nkn-commercial/services/nkn-node/
sudo tar -czvf ~/ChainDB.tar.gz ./ChainDB
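Optionally, sanity-check the archive before shipping it around; tar can list its contents without extracting:
tar -tzf ~/ChainDB.tar.gz | head
A clean listing of the first few ChainDB entries means the archive is readable.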
Limited space option (tunnel and compress on another server over ssh):
How To Use tar Command Through Network Over SSH Session
ssh user@address "tar czf - -C /home/nkn/nkn-commercial/services/nkn-node ChainDB" > /destination/ChainDB.tar.gz
Using -C keeps ChainDB at the top level of the archive, matching the archive created above.
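If you want a progress indicator while it streams, you can pipe through pv, assuming pv is installed on the receiving machine:
ssh user@address "tar czf - -C /home/nkn/nkn-commercial/services/nkn-node ChainDB" | pv > /destination/ChainDB.tar.gz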
Another Edit: Forgot to mention, it’s probably a good idea to stop the service before copying files, at least in my mind. Copying a file while it’s being written to can be problematic.
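Concretely, wrapping the copy looks like this, using the same service name as the restore script below:
sudo systemctl stop nkn-commercial.service
# ...create or copy the backup here...
sudo systemctl start nkn-commercial.service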
Note: I’m a bit of a beginner with shell scripting. For any of my scripts, please look them over carefully before using, in case of mistakes or mismatches with what your system needs.
Fetching Backed Up ChainDB
What I did was create a compressed tar file of ChainDB and put it on my main VPS, which has no bandwidth usage rates. Then I made a script that deletes the ChainDB folder, downloads the tar, and extracts it. I did notice that nodes with 25GB of disk did not have enough space to hold the tar and extract it at the current height; at height 2,560,819 I needed about 31GB to hold the tar and extract it.
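A quick way to check you have room before kicking off the restore:
df -h /home/nkn
# You want roughly 31GB free at the current height to hold the tar and extract it.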
You can create a bucket with one of the cloud providers like Google Cloud or AWS and distribute the files from there if you have slow home internet, but you pay for the bandwidth. I have fiber at home with amazing latency, but I only pay for 35Mb/s up/down, and me being impatient, I ended up putting the files on my VPS.
Here is part of a Linux script I made, except I changed it for a Google Cloud Storage bucket, since I use a plain vanilla wget from my VPS. You can give any Google Cloud project (and the servers underlying it) access to the cloud storage by adding that project's {project_number}-compute@developer.gserviceaccount.com IAM service account as a member in the project that owns the bucket, then granting it bucket permissions. You can also make the bucket open to the internet, but may pay more in bandwidth. You can also just replace the gsutil line with your download method of choice.
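For example, granting read access to a project's default service account could look like the line below; {project_number} and {bucket_name} are placeholders for your own values:
gsutil iam ch serviceAccount:{project_number}-compute@developer.gserviceaccount.com:objectViewer gs://{bucket_name}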
#Stop NKN Node
sudo systemctl stop nkn-commercial.service
#Setup temporary folder
mkdir -p ~/nkntmp
gsutil -m cp -r gs://{project_name}/tmp/ChainDB.tar.gz ~/nkntmp
#Remove old ChainDB and extract the backup
sudo rm -r /home/nkn/nkn-commercial/services/nkn-node/ChainDB
sudo tar -xzvf ~/nkntmp/ChainDB.tar.gz -C /home/nkn/nkn-commercial/services/nkn-node/
# Fix permissions
sudo chown -R nkn:nkn /home/nkn
#Start NKN Node
sudo systemctl start nkn-commercial.service
#Cleanup
#sudo rm -r ~/nkntmp #Currently disabled in case of a booboo.
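If you host the tar on your own VPS instead of a bucket, the gsutil line can be swapped for a plain wget; a sketch, with the URL as a placeholder for wherever you put the file:
wget -O ~/nkntmp/ChainDB.tar.gz https://your-vps.example.com/ChainDB.tar.gz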
Note: Check the script before using it to make sure the NKN node folder matches your install, as well as for any other mistakes.
Sources that helped me