Save and Restore Environment
- The following steps should almost NEVER be done.
- If you want to temporarily stop your environment and start it later (i.e., for cost savings), follow this link.
- These steps are the LAST RESORT scenario when you are absolutely done with your exploration and wish to archive your environment for later.
- This is a very lengthy and bandwidth-intensive process, so please make sure this is what you intend to do.
- Unless you've made a lot of changes that you'd like to keep, it's almost always easier to follow this link to start from scratch with a fresh Docker Image with the latest code examples when you're ready to explore again.
- However, if you really wish to continue, here you go...!
- `exit` out of your Docker Container (again, this is a last resort). This will stop your Docker Container as well as all services running within the Container. You can confirm the Container has stopped as shown in the sketch below.
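A quick way to confirm the Container actually stopped after exiting. This is a minimal sketch, assuming your Docker Container is named `pipeline` (the same name used in the export command below):

```bash
# List all containers, including stopped ones, and look for the pipeline container.
# Its STATUS column should read "Exited (...)" rather than "Up ...".
sudo docker ps -a | grep pipeline
```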
- The following creates a 13 GB+ `.tar` file that needs to be downloaded from your Cloud Instance to your local laptop (very time- and bandwidth-consuming)
cd ~ && sudo docker export --output="pipeline.tar" pipeline
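Before or after running the export, it can help to confirm there is enough free disk space and that the archive was actually written. These checks are optional and use only standard Linux tooling, not anything specific to the pipeline project:

```bash
# Check free disk space on the Cloud Instance (the export needs 13 GB+ free)
df -h ~

# After the export finishes, confirm the archive exists and note its size
ls -lh ~/pipeline.tar
```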
- Change permissions to allow the file to be readable for downloading
sudo chmod a+r pipeline.tar
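Because the download is long, recording a checksum on the Cloud Instance before transferring lets you verify the file's integrity afterwards. This is an optional sketch using standard tooling; the `.sha256` file name is just an illustrative choice:

```bash
# On the Cloud Instance: record a checksum of the archive for later verification
sha256sum ~/pipeline.tar | tee ~/pipeline.tar.sha256
```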
- Download the newly-created `pipeline.tar` file from your Cloud Instance to your local home directory using your `.pem` or `.ppk` file from earlier:
# Linux/MacOS
scp -i ~/.ssh/<your-pem-file.pem> <your-username>@<your-cloud-instance-public-ip>:pipeline.tar ~/pipeline.tar
# Windows
scp -i \Users\<username>\.ssh\<your-ppk-file.ppk> <your-username>@<your-cloud-instance-public-ip>:pipeline.tar \Users\<username>\pipeline.tar
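After the transfer finishes, you can compare the checksum of the local copy against the one recorded on the Cloud Instance in the optional step above. On MacOS, `shasum -a 256` stands in for `sha256sum`:

```bash
# On your laptop: compute the checksum of the downloaded archive
sha256sum ~/pipeline.tar        # Linux
shasum -a 256 ~/pipeline.tar    # MacOS
# The output should match the value recorded on the Cloud Instance
```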
- When you are ready to restore the Docker Container from the downloaded `.tar` file on your local laptop, use the following - Note: It is almost always easier/faster to follow this link to start with a fresh install unless you've made a lot of changes that you need to preserve. Also note that an archive created with `docker export` is restored with `docker import` (not `docker load`, which only works with archives created by `docker save`):
sudo docker import ~/pipeline.tar pipeline
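To confirm the restore worked, you can list local images; the `pipeline` repository name below is simply whatever name you passed to `docker import`. Starting a Container from the restored image is left out here because the required ports and run flags depend on your original setup:

```bash
# Confirm the restored image is now available locally
sudo docker images | grep pipeline
```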