    Caught Off Guard: Need a solid backup plan

    At the beginning of this year I moved from Ubuntu 10.04 to Kubuntu 12.04. Before I installed Kubuntu I more or less followed a rigid backup plan. After I installed Kubuntu I got a little relaxed; I was busy configuring my desktops and playing with the desktop effects. My backup computer/file server is an old Pentium 4 computer from 2002 which has been rock solid for years, and I failed to realize it has aging hard drives. Today, with little warning, one of the data drives started to fail. I heard the thrashing and clicking and immediately started backing up the most important data, the family photos. Next I backed up the family videos. The drive died before I got to the documents. Fortunately my current files are on the newer computer. I have multiple backup copies and occasionally back up to data DVD, which minimized the loss.

    Wow! What a helpless feeling and a lesson learned.

    life0riley

    #2
    Indeed, one backup is little better than none...
    I back up things like photos to several USB drives every day.
    One of them is in a different place and receives a copy about once a month.

    From a technical perspective the cloud is a fine solution, but from a social/privacy point of view it's totally unacceptable.
    I have been experimenting with OpenStack, but it's not yet running as easily as I'd wish.

    An ssh connection with incremental backup software is a reasonable option right now.
    Just don't do the KDE git thing
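A minimal sketch of the ssh-plus-incremental-backup idea, using rsync over ssh (the host name `backupbox` and both paths are made-up examples, not anything from this thread):

```shell
# Push ~/Pictures to a remote machine over ssh; rsync only transfers changes.
# -a preserves permissions/timestamps, -z compresses in transit.
rsync -az -e ssh ~/Pictures/ backupbox:/srv/backups/pictures/
```

The trailing slash on the source means "copy the contents of Pictures", not the directory itself.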



      #3
      Originally posted by Teunis View Post
      From a technical perspective the cloud is a fine solution, but from a social/privacy point of view it's totally unacceptable.
      It's fine if you encrypt first. I have a nightly cron job that backs up everything to an S3 bucket on Amazon Web Services, but first it's encrypted locally. AWS never has the key.
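A rough sketch of such an encrypt-then-upload job (the demo paths, passphrase, and bucket name are invented for illustration; it assumes the openssl and AWS CLI tools are installed):

```shell
# Build a small demo archive (stand-in for real data)
mkdir -p /tmp/demo_docs
echo "important stuff" > /tmp/demo_docs/note.txt
tar -czf /tmp/backup.tar.gz -C /tmp demo_docs

# Encrypt locally before anything leaves the machine -- AWS never sees the key
echo "a long random passphrase" > /tmp/pass
chmod 600 /tmp/pass
openssl aes-256-cbc -e -in /tmp/backup.tar.gz -pass file:/tmp/pass -out /tmp/backup.tar.gz.enc

# Then push only the encrypted copy (uncomment with a real bucket):
# aws s3 cp /tmp/backup.tar.gz.enc s3://my-backup-bucket/
```

Drop a script like this into cron and only ciphertext ever reaches the cloud.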



        #4
        Thanks Teunis. I like the idea of multiple USB drives on a rotating basis. It certainly minimizes the loss if something fails. I can adopt this while I look at some backup software options. lol...it won't be anything like the KDE git thing.

        I haven't really looked at the cloud option SteveRiley, but encryption makes it a possibility for me. I'll do some research on this.

        Before I switched to Kubuntu I used cp to back up Windows shares to a drive on my local machine. I never got around to putting this in cron.

        Code:
        cp -ruf /home/life0riley/.gvfs/"users on computer01"/life0riley/Pictures/"life0riley Pics"/"PICTURES 2012" /home/life0riley/Pictures/pictures_2012
        I've noticed the Windows shares I access via Dolphin do not appear to be mapped to any specific location that I can find. I searched some posts and this seems to be the consensus.
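For what it's worth, scheduling a cp like the one above is a single crontab entry (the schedule and the shortened destination path here are examples; add it with `crontab -e`):

```shell
# m h dom mon dow  command -- run the copy nightly at 02:30
30 2 * * * cp -ruf "/home/life0riley/.gvfs/users on computer01/life0riley/Pictures" /home/life0riley/Pictures/backup
```

One caveat: .gvfs mounts only exist while you are logged in to a desktop session, so a cron job will find nothing there otherwise.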

        Thanks for the Replies,

        life0riley


          #5
          Originally posted by life0riley View Post
           I haven't really looked at the cloud option SteveRiley, but encryption makes it a possibility for me. I'll do some research on this.
          I worked for Amazon Web Services for a couple years, and still do quite a lot in cloud computing now. I'm very familiar with what works and what doesn't, so if you're looking for something or come up with questions, just let me know.



            #6
            This is an older thread, but I thought I'd follow up here.

            I started using Smb4K. I struggled a little in the beginning setting it up properly, but it is working fine for me now. I now have a directory where I can find my mounted share under ~/smb4k, and I'm currently backing up manually while I look into a more automated option. I prefer a synchronize option with an exact copy of my files on another disk. I don't want to rely on software to restore my backup.

            I started reading up on AWS. As usual, I'm slow to jump into something new without understanding it completely. I'll need to figure out what my needs are and see if this is something feasible for me.

            I also need to read up on encryption software. I already have KGpg installed. I'm using Kubuntu 12.04. Are there better options out there for encryption?

            Thanks,

            life0riley


              #7
              Originally posted by life0riley View Post
              I prefer a synchronize option with an exact copy of my files on another disk. I don't want to rely on software to restore my backup.
              rsync is the tool you'll want. While it's typically used to synchronize files between two hosts, you can in fact use it to synchronize files between separate locations on the same host -- say between two drives or subdirectories.

               There are a number of GUI front-ends to rsync; options include luckybackup and backintime-kde, among others. These are in the repositories.

              Originally posted by life0riley View Post
              I also need to read up on encryption software. I already have KGpg installed. I'm using Kubuntu 12.04. Are there better options out there for encryption?
              It's difficult to beat the command line here.

               Create and store your passphrase in a file (and make it readable only by you), otherwise you'll be prompted for it every time:
               Code:
               echo "some passphrase I really like" > ~/passphrase
               chmod 600 ~/passphrase
              Encrypt a file:
              Code:
               openssl aes-256-cbc -e -in /path/to/file -pass file:$HOME/passphrase -out /path/to/encrypted-file
              Since you're storing the passphrase in a file on the local computer, encryption is really useful only if you plan to store the encrypted files elsewhere -- say on another computer, on a removable drive that you will later disconnect and put in a safe, or on a cloud service. Just don't store the passphrase in the same place as the encrypted files!

              Decrypt a file:
              Code:
               openssl aes-256-cbc -d -in /path/to/encrypted-file -pass file:$HOME/passphrase -out /path/to/decrypted-file
              Notes
              • The encryption algorithm is AES, with a 256-bit key, in CBC mode. Do not use ECB mode, as this doesn't add sufficient randomness -- identical plaintext portions are encrypted into identical ciphertext, which makes pattern detection pretty easy.
              • The encryption form of the command uses -e as the second parameter.
              • The decryption command uses -d as the second parameter.



                #8
                Thanks! This really helps.


                  #9
                  I'll second the use of rsync. I've been using it for years. It took me a while to get the syntax right, but once that is done, it is FAST as it only backs up incrementally -- even within a modified file, IIUC. I use rsync on my local LAN, but it is capable of working across the Internet as well.

                   I have several Linux machines (netbook, laptop, desktop, media PC, my wife's laptop, and the machine at the shop). I create a separate /data directory on all my machines, and I mirror that /data directory among all of them with rsync. If one machine dies, I go and sit down at another one and carry on. The only thing lost would be changes made on the current machine between its death and the last backup. Up until recently, when I sold our business, I also had an offsite backup on my desktop machine at work, using my laptop as 'sneakernet' to keep it up to date twice a month.

                  One problem with my system is that it is TOO good. When deleting files for good, I have to delete them on all 6 machines, or else the files come back again at the next sync. But I've never lost a byte despite two disk deaths this past year. I think there is a way to get rsync to propagate the deletions for me as well, but I've specifically avoided using it in case I do delete something I want by mistake, and then don't discover that it is missing until after the deletion has propagated itself across all my machines.

                  I think I've solved the offsite backup issue by using an external eSATA enclosure with a standard 1.5 TB hard disk. I just plug that disk into the eSATA dock, and run rsync. The disk and an extra dock I keep at my father-in-law's place about an hour out of town, and sync it with my laptop when we visit him. I only sync that once every month or so, but it is a good failsafe for archived material, like the past 20 years of videos of my daughter as she grew up, and all our digital photos (scanned or taken with digital camera) over the past 40 years.

                  Frank.
                  Last edited by Frank616; May 19, 2013, 10:06 AM.
                   Linux: Powerful, open, elegant. It's all I use.



                    #10
                    Thanks Frank! Now I have some very good information to design my own backup plan. The final touch will be off-line storage just in case something happens to my data storage at home.

                    I appreciate the replies.

                    life0riley


                      #11
                       I use Ubuntu One and back up all of my documents and pictures to the cloud. You get 5GB free, and it is only $2.99 a month for another 20GB if you need it.



                        #12
                        That made me think and put things into perspective. I'll definitely have to pay for storage. Our photos are the most valuable. Back in 2004 we digitized our family photos for both sides of our family. Some of the photos on my side date back to the 1920s. We sent disks to family members and kept backup sets for ourselves. Our own family photos are added to this, so the storage continues to grow. We want to preserve this for our children.


                          #13
                          rdiff-backup is worth looking into as well.
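Basic usage, for anyone curious (the paths here are invented examples). rdiff-backup keeps the target as an exact mirror plus reverse increments, so you get both a plain copy of your files and a history to restore from:

```shell
# Mirror a directory; increments accumulate in the target's rdiff-backup-data
rdiff-backup /home/life0riley/Pictures /mnt/backup/Pictures

# See which snapshots are available
rdiff-backup --list-increments /mnt/backup/Pictures

# Restore the tree as it looked 7 days ago
rdiff-backup -r 7D /mnt/backup/Pictures /tmp/pictures-7-days-ago
```

Because the mirror itself is a normal directory tree, the most recent backup is readable even without the tool, which fits the "don't rely on software to restore" preference above.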



                            #14
                             Originally posted by Detonate View Post
                             You get 5GB free, and it is only $2.99 a month for another 20GB if you need it.
                            Once one gets up over 100 GB of files (my situation), however, doesn't this get a bit pricey over the long haul?

                             I hate getting into subscriptions/contracts. I learned that with the cell phone carriers. That 'only $x.xx/mo' soon adds up over a year, or several years. A 2 TB hard drive is currently $100 where I live. I'd pay for that in less than a year.

                            Frank.
                             Linux: Powerful, open, elegant. It's all I use.



                              #15
                              When you use an offsite service, you aren't paying just for the storage. You're also paying for whatever redundancy and durability provisions the service has included.

                              Since you're looking for a way to archive data, a better service would be one that optimizes its charge structure for write-lots, read-little. Amazon Web Services has a pretty neat feature called Glacier, designed just for this purpose. It's built on S3, which means you automatically benefit from multiple copies of every object stored. It's priced to encourage you to store stuff frequently, but not use it for regular retrieval. $0.01 per gigabyte per month is very cheap, considering Glacier has the same durability as regular S3. Data transfer in is always free. Data transfer out is free for the first gigabyte each month. Because Glacier is designed for archival purposes, retrieval is scheduled at intervals and isn't immediate.

                              Let's say you want to upload 1000 GB to Glacier. Let's further say that on a monthly average, you'll have 1000 upload/retrieval requests, 100 GB inbound per month, and 10 GB outbound per month. Your monthly bill will be $11.13 -- I calculated this based on the prices for the US-West-2 region in Oregon. US East would be comparable.

                              Please don't interpret the above as a commercial. While I no longer work for AWS, I'm still an admirer of their technology. Very few providers can match their technical capabilities. My entire digital life is archived at AWS, which allows me to sleep very soundly at night.
