    [SOLVED] A lot of duplicate files in dolphin

Hi, I don't know how they got there, but I have a lot of duplicate files in my Documents folder: some end in [1], some in ~, and some are capitalised. I can use fslint to find duplicate files, but it would take a long time to delete each duplicate by hand. Any idea what is causing this? More importantly, how can I search for duplicate files recursively and delete them? Maybe find files ending in ~ or [1] and delete them recursively.

Thanks

    #2
The ~ is used as a backup file suffix:



The Kate/KWrite text editor has an option to enable or disable backup files:



    The Dolphin file manager has an option to filter files:



    Dolphin > Settings > Configure Dolphin > Startup: Show filter bar
    Default shortcut: 'Ctrl+I'
    Last edited by Rog132; Jul 11, 2014, 10:07 AM.
    A good place to start: Topic: Top 20 Kubuntu FAQs & Answers
Searching FAQs: Google Search 'FAQ from Kubuntuforums'



      #3
Thanks Rog132, I found the option in Kate, which should help. However, the filter option does not recurse into subdirectories. Also, filtering or searching for [1] does not bring up anything, while filtering *[1]* brings up all files with a 1 in the name.

However, it would take a long time going through all the folders.



        #4
Getting better results with the filter *\[1\] or the command line: find . -type f -name "*\[1\]" -exec ls {} \;
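The escaping matters here: in find's -name pattern, an unescaped [1] is a bracket expression matching the single character 1, not a literal "[1]". A quick sketch in a throwaway directory (file names invented for the demo):

```shell
# Demo: why [1] must be escaped in find's -name pattern.
tmp=$(mktemp -d)
cd "$tmp"
touch 'report[1]' 'report1' 'notes'

# Unescaped: [1] is a bracket expression matching the character "1",
# so "*[1]" finds report1 but NOT report[1].
find . -type f -name "*[1]"

# Escaped: "*\[1\]" matches a literal "[1]" suffix, so it finds report[1].
find . -type f -name "*\[1\]"

cd / && rm -rf "$tmp"
```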



          #5
Originally posted by leonk5:
However, it would take a long time going through all the folders.
          Care is needed for the following!

          On the command line, **/*~ will match all files whose names end in ~, anywhere in and down from the current directory. You could carefully use it with an rm command; I'd check first with ls.

          That won't get files in hidden directories; if that's an issue you could use

          echo $(find . -name '*~')

          and if you like what you see change the echo to rm.
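This look-before-you-delete workflow can be sketched end to end (the file names below are invented for the demo; note the unquoted $(...) word-splits on spaces, as the following posts discover):

```shell
# Sketch of the "preview with echo, then delete" workflow for ~ backups.
tmp=$(mktemp -d)
cd "$tmp"
touch notes.txt notes.txt~ draft~

# Preview what would be removed.
echo $(find . -name '*~')

# If the list looks right, swap echo for rm.
# (Unquoted $(...) splits on whitespace, so this breaks
# on names containing spaces -- see the posts below.)
rm $(find . -name '*~')

ls            # only notes.txt remains
cd / && rm -rf "$tmp"
```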
          Regards, John Little



            #6
jlittle, that looks nifty, and it works fast. However, the rm part does not seem to work. I tried
rm $(find . -name '*~')
but got errors:
rm: cannot remove ‘./Computing/Remove’: No such file or directory
rm: cannot remove ‘files’: No such file or directory
rm: cannot remove ‘recursively~’: No such file or directory



I have had great success with the following, though I don't fully understand why.

Find files with [1] and list them:
find . -type f -name "*\[1\]" -exec ls {} \;
Remove them:
find . -type f -name "*\[1\]" -exec rm -rf {} \;

Show duplicate files recursively and send the output to a file:
fdupes -S -r /home/leonk5/Documents/ >fdupes.txt

Delete all but the first instance, if the list looks OK:
fdupes -drN /home/leonk5/Documents/ >fdupes3.txt
            Last edited by leonk5; Jul 12, 2014, 05:47 AM.



              #7
I see the problem with rm $(find . -name '*~'): it does not like files with spaces. When I removed the spaces from the couple of files ending in ~, it worked. E.g.
              echo $(find . -name '*~') gives
              ./Computing/Remove files recursively~

              but
              rm $(find . -name '*~') gives
              rm: cannot remove ‘./Computing/Remove’: No such file or directory
              rm: cannot remove ‘files’: No such file or directory
              rm: cannot remove ‘recursively~’: No such file or directory

It is splitting up the file name at the spaces. When I removed the spaces, it worked. If I had a lot of such files, as I originally did, that would be no good. Is there a way to tell rm to ignore the spaces?
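The word-splitting happens in the shell, before rm ever sees the names, so rm itself can't be told to ignore it. One common workaround is to NUL-delimit the names so no splitting occurs (a sketch; -print0/-delete assume GNU find, and the file names are invented for the demo):

```shell
# Names with spaces break unquoted $(...) because the shell
# word-splits the expansion on whitespace. NUL-delimiting avoids that.
tmp=$(mktemp -d)
cd "$tmp"
touch 'Remove files recursively~' 'keep me.txt'

# Preview, NUL-separated (safe for any file name):
find . -name '*~' -print0 | xargs -0 ls --

# Delete the same set:
find . -name '*~' -print0 | xargs -0 rm --

# GNU find can also delete directly, with no shell expansion at all:
# find . -name '*~' -delete

ls            # keep me.txt survives
cd / && rm -rf "$tmp"
```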



                #8
I've never liked the -exec option to the find command; I'd rather use find to give me a list of files, maybe save it somewhere so I can look at the list, and maybe patch it up. For example:
                find . -type f -name "*\[1\]" > x
                then when I like what's in x,
                rm $(<x)
Or, if there are spaces in the names,
                while read file;do rm "$file";done < x
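That save-review-delete workflow can be sketched in full; adding IFS= and -r to read (an extra precaution beyond the original post) also keeps leading whitespace and backslashes in names intact:

```shell
# Save find's results to a reviewable file, then delete line by line.
tmp=$(mktemp -d)
cd "$tmp"
touch 'a[1]' 'two words[1]' 'keep.txt'

# Save the candidate list somewhere I can look at it.
find . -type f -name "*\[1\]" > x
cat x          # inspect; edit out anything worth keeping

# Quoting "$file" preserves embedded spaces; IFS= and -r keep
# leading whitespace and backslashes intact too.
while IFS= read -r file; do rm "$file"; done < x

rm x
ls             # keep.txt survives
cd / && rm -rf "$tmp"
```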

                Never saw fdupes before, would be useful in some situations, thank you.
                Regards, John Little



                  #9
That worked great. There's probably a lot more to this, but in my case, between all the commands and advice, my folders are cleaned up. Hopefully I haven't deleted anything I needed. There's one thing to watch out for with fdupes: if you have the same file in two or more folders and want to keep every copy, be careful, because fdupes will keep only one file. I will mark this as solved.

This also works:
find . -type f -name "*~" -exec rm {} +
of course only after previewing the list first with find . -type f -name "*~" > x
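A note on that form: {} + hands find's results to rm in batches rather than running one rm per file, and because find passes each name as a whole argument, spaces are no problem either (a small sketch with invented file names):

```shell
# -exec rm {} + batches many names into one rm call, and each name
# is passed as a single argument, so spaces in names are safe.
tmp=$(mktemp -d)
cd "$tmp"
touch 'old draft~' 'notes~' 'keep.txt'

find . -type f -name "*~" -exec rm {} +

ls             # keep.txt survives
cd / && rm -rf "$tmp"
```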
                  Last edited by leonk5; Jul 12, 2014, 11:43 AM.
