    okay, now i need a little help ;-)

    ...I can duplicate this in either Natty or Squeeze and it's causing me a fair bit of extra work every day - mget doesn't work in either distribution.

    I've got a cron job that runs on my leased VPS that makes an SQL dump, then creates a .tgz archive of the entire site in my home directory on the VPS. The script creates a file using the current date as part of the filename, like this -

    backup-`date -I`.tgz

    Since ftp isn't a shell I can't use a variable in the ftp script so I have to use mget and mdelete - both of which worked fine until Natty.
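
    For reference, the server-side piece boils down to something like this (just a sketch - the real paths and credentials differ):

    Code:
    #!/bin/bash
    # nightly cron job on the VPS (sketch - names simplified)

    # dump the database, then archive the whole site with today's date in the filename
    mysqldump -u dbuser -pXXXXXXXX sitedb > /home/wizard/sitedb.sql
    tar -czf /home/wizard/backup-`date -I`.tgz /home/wizard/public_html /home/wizard/sitedb.sql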

    Here's what happens if I try to use mget against the server -

    wizard@wizard-netbook:~$ ftp vps.com
    Connected to vps.com.
    220---------- Welcome to Pure-FTPd [privsep] [TLS] ----------
    220-You are user number 1 of 50 allowed.
    220-Local time is now 10:57. Server port: 21.
    220-This is a private system - No anonymous login
    220 You will be disconnected after 15 minutes of inactivity.
    Name (vps.com:wizard): wizard
    331 User wizard OK. Password required
    Password:
    230 OK. Current restricted directory is /
    Remote system type is UNIX.
    Using binary mode to transfer files.
    ftp> mget *.tgz
    ftp>
    ftp> quit
    221-Goodbye. You uploaded 0 and downloaded 0 kbytes.
    221 Logout.
    wizard@wizard-netbook:~$

    See the empty ftp prompt? That's all I get. I can duplicate this on Natty and Squeeze but my webhost's technical support can do it using my account with no trouble, so I don't think it's server side.
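
    If anyone wants to poke at it, the same session can be re-run with more noise - these are all standard commands in the stock command-line client (output omitted):

    Code:
    wizard@wizard-netbook:~$ ftp -d vps.com   # -d turns on protocol debugging
    ...
    ftp> glob                                 # toggle wildcard expansion for mget/mdelete
    ftp> passive                              # toggle passive-mode transfers
    ftp> mget *.tgz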

    Interestingly, if I issue manual commands with gftp mget works but mdelete doesn't.

    Anybody got any ideas?

    thanks -
    we see things not as they are, but as we are.
    -- anais nin

    #2
    Re: okay, now i need a little help ;-)

    Are you sure that your files reside in the / directory of the ftp server?
    Alternatively, try getting the files with get and decompressing them manually.
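
    If the name is predictable you could also compute it in the shell and use a plain get, something like this (untested sketch):

    Code:
    #!/bin/bash
    # the daily backup follows a known backup-YYYY-MM-DD.tgz pattern
    FILE="backup-$(date -I).tgz"

    ftp -inv vps.com <<EOF
    user wizard xxxxxxxx
    binary
    get $FILE
    bye
    EOF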



      #3
      Re: okay, now i need a little help ;-)

      Originally posted by walto
      Are you sure that your files reside in the / directory of the ftp server?
      Alternatively, try getting the files with get and decompressing them manually.
      Yes, the file exists in that location. get works just fine and I don't need to decompress it - it's a daily website backup. The problem is that since the filename changes every day I have to use mget instead of get in my download script.

      The script has worked for years - it quit working when I installed Natty and doesn't work in Squeeze either. mget doesn't work even in a manual ftp session, but it does work with other distributions - the problem is client-side.

      I'd be interested in whether anybody running Natty who also has access to a remote ftp server also sees this problem.
      we see things not as they are, but as we are.
      -- anais nin



        #4
        Re: okay, now i need a little help ;-)

        @wiz, your procedure is way over my head. However, I looked for your mget command on my Natty system and found this:

        Code:
        don@natty-vm:~$ sudo apt-cache policy mget
        N: Unable to locate package mget
        don@natty-vm:~$ sudo apt-cache policy get
        N: Unable to locate package get
        don@natty-vm:~$ sudo apt-cache policy wget
        wget:
         Installed: 1.12-2.1ubuntu2
         Candidate: 1.12-2.1ubuntu2
         Version table:
         *** 1.12-2.1ubuntu2 0
            500 http://us.archive.ubuntu.com/ubuntu/ natty/main i386 Packages
            100 /var/lib/dpkg/status
        don@natty-vm:~$
        :P



          #5
          Re: okay, now i need a little help ;-)

          Thanks, dibl - mget is internal to the command-line ftp client. It's not a separate package.

          mget is the same as get but will grab multiple files without prompting (as long as interactive mode is off - ftp's -i flag) -

          F'rinstance - if you've got four files

          1.tgz
          2.tgz
          3.tgz
          4.tgz

          mget *.tgz will get them all without asking for confirmation. Since the filename changes every day that's the way my script downloads the daily website backup.
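
          It's easy to see from inside a session - the prompt command toggles the same per-file confirmation that the -i flag turns off (a sketch; real output trimmed):

          Code:
          ftp> prompt
          Interactive mode off.
          ftp> mget *.tgz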

          The download script looks like this -


          Code:
          #!/bin/bash

          HOST=vps.com
          USER=wizard
          PASS=xxxxxxxx

          cd /home/wizard/archive/website

          # grab today's archive(s) by wildcard, then clear them off the server
          ftp -inv $HOST << EOF
          user $USER $PASS
          binary
          mget *.tgz
          mdelete *.tgz
          bye
          EOF

          # prune local copies older than a week
          find /home/wizard/archive/website/*.tgz -mtime +6 -exec rm {} \;

          exit 0

          It logs on to the ftp server, grabs the website archives, deletes them from the server then removes any archives more than a week old from my local PC.
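
          For completeness, cron kicks the script off every morning with an entry along these lines (the path is just where I happen to keep it):

          Code:
          # m h dom mon dow command
          30 6 * * * /home/wizard/bin/get-backup.sh >> /home/wizard/archive/backup.log 2>&1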
          we see things not as they are, but as we are.
          -- anais nin



            #6
            Re: okay, now i need a little help ;-)

            Ahhhhhhhhhh. OK, cool, I learned something today.

            gftp is about as deep as I go with file transferring (obviously).

            Thanks -- good luck with your challenge.



              #7
              Re: okay, now i need a little help ;-)

              Assuming you are using bash (this also works in some other shells and scripting languages), you can use "HERE documents" to automate FTP and other similar tasks. A simple example would be:

              Code:
              #!/bin/bash
              
              site=ftp.example.com
              username=user
              passwd=password
              backupdir=$HOME
              filename="backup-$(date '+%F-%H%M').tar.gz"
              
              echo "Creating a backup file $filename of $backupdir."
              
              # Make a tar gzipped backup file
              tar -cvzf "$filename" "$backupdir"
              
              ftp -in <<EOF
              open $site
              user $username $passwd
              bin
              put $filename 
              close 
              bye
              EOF
              More details on how HERE Documents work in bash can be found here: http://tldp.org/LDP/abs/html/here-docs.html

              The basic idea is that the shell pipes the text of the HERE document into the command until it finds the "end marker" that was specified right after the original "<<". The magic, of course, is that you can use variables in the HERE document to control what gets piped into FTP.
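
              A quick way to watch the expansion happen is to pipe the same HERE document into cat instead of ftp:

              Code:
              #!/bin/bash
              filename="backup-$(date '+%F-%H%M').tar.gz"

              # cat prints exactly what ftp would receive - note that
              # $filename has already been expanded by the shell
              cat <<EOF
              user myuser mypassword
              bin
              put $filename
              bye
              EOF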

              You can also automate some stuff with FTP by using a .netrc file in your home directory. Just google for "ftp .netrc" for information on that.
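
              As a rough sketch, a minimal ~/.netrc entry looks like this (the file has to be chmod 600, otherwise ftp refuses to use the password in it):

              Code:
              machine ftp.example.com
              login user
              password secret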

              Let me know if that's not clear and I'll provide more details.



                #8
                Re: okay, now i need a little help ;-)

                Originally posted by tnorris
                Assuming you are using bash (and some other scripting languages) you can use "HERE Documents" to automate FTP and other similar things. A simple example would be...
                That's actually pretty close to what I use now, but there are a couple of issues. First, we're moving files in the wrong direction - it's mget that's not working, not put.

                My script works, it just doesn't work in Natty or Squeeze.

                Second, since ftp isn't a shell (well, it is in the strictest sense) you can't pass variables to it. I found it kinda interesting that you can authenticate to an ftp daemon using shell variables but once you're logged on $filename doesn't work.

                I read a couple of references that suggested putting the variable in parentheses or curly braces like this

                ${filename}

                but no joy there either.
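
                For reference, here's a stripped-down version of what I was testing (host and password changed, obviously):

                Code:
                #!/bin/bash
                filename="backup-$(date -I).tgz"

                # unquoted EOF, so the shell should expand ${filename} before
                # ftp ever sees it (a quoted delimiter - <<'EOF' - would pass
                # it through literally instead)
                ftp -inv vps.com <<EOF
                user wizard xxxxxxxx
                binary
                get ${filename}
                bye
                EOF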

                I do really appreciate the effort you put into this, though.

                thanks -
                we see things not as they are, but as we are.
                -- anais nin
