
find /var/www/ -type f >> FullFileList.txt

I'm trying to get some structure into a poorly sorted collection of about 100 million html, jpg, flv, swf, doc, pdf, etc. files which are spread across about a million subfolders.

find /var/www/ -type f >> FullFileList.txt

A whole run would take about 20 hours and produce about 100 million lines,

but for some reason it always fails somewhere around 60%, at a file size of about 2 GB.

I tried it about 10 times in a row.

The filesystem is XFS.

Maybe it's just one corrupted filename or a read error, but it doesn't stop at exactly the same line every time.

Now I wonder if there is a way to continue,

and if not, whether there is another method to produce the same listing, one which would be able to continue instead of restarting.
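
One way to make such a run resumable, sketched under the assumption that /var/www has a manageable number of top-level directories (the /var/filelists path is just an example), is to split the listing per top-level directory, so that a crash only costs the chunk that was in progress:

#!/bin/bash
# Build the full file list in per-directory chunks that survive a restart.
OUTDIR=/var/filelists            # example location for the chunk files
mkdir -p "$OUTDIR"

for DIR in /var/www/*/; do
    CHUNK="$OUTDIR/$(basename "$DIR").list"
    # chunks finished in an earlier run are skipped
    [ -f "$CHUNK" ] && continue
    # write to a temp file first; only rename when find finishes cleanly,
    # so a chunk that errored out stays as .tmp and gets retried/inspected
    find "$DIR" -type f > "$CHUNK.tmp" 2>> "$OUTDIR/errors.log" && mv "$CHUNK.tmp" "$CHUNK"
done

# files sitting directly in /var/www need one extra shallow pass
find /var/www -maxdepth 1 -type f > "$OUTDIR/_toplevel.list"

# stitch the finished chunks together
cat "$OUTDIR"/*.list > FullFileList.txt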

Thanks a lot in advance,

Jonas

Comments

  • mfillpot
    mfillpot Posts: 2,177
    Have you tried updating your slocate database using the "updatedb" command, then doing the search using locate, to bypass the direct find run?

    An example would be:
    locate /var/www | while read -r FIL; do if [ -f "$FIL" ]; then echo "$FIL" >> ~/files.txt; fi; done
  • joonas
    joonas Posts: 9
    hey,
    okay
    right, I did not try this, thank you
    updatedb  --database-root /var/www  --output /var/WWWMLOCATEDB
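    Once that database exists, the listing itself can presumably be pulled from it in minutes rather than hours, along these lines (the output filename is just an example, and directories would still be included in the output):
    # read from the custom database instead of walking the disk again
    locate --database /var/WWWMLOCATEDB /var/www > ListFromDB.txt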
    

    Question: will I be able to continue this in case it also fails at a certain point?

    It will probably take until tomorrow before I can see/report the results.
    The file seems to be growing more slowly than it did with find and is taking fewer resources (maybe it's restricted by a setting?).

    Btw, do you guys know of any open-source/free search-engine-like tool which full-text indexes the whole folder (100 million files, 1 million folders, 5000 GB) with high performance and makes the search available to website visitors?
  • joonas
    joonas Posts: 9
    updatedb seems too slow?
    Running for 2 hours it has produced only 75,000 lines, whereas find did 10 million lines in that time.
    It is constantly using only 20 MB of RAM; maybe that's a setting somewhere?
  • mfillpot
    mfillpot Posts: 2,177
    I'm sorry, I did not realize the scope of your logging. With the quantity of files you have, I do not know of a program or method that can index the files on a faster timeline.

    To my knowledge, using a pre-built slocate database is the quickest method. What is slowing it down is the per-file verification in the locate loop; unfortunately, if you turn that off it will also include the directories in your output file.
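
    If the per-file test is the bottleneck, one alternative sketch (the filenames here are just examples) is to dump the locate output once and subtract a separately built directory list, so nothing is stat'ed file by file:
    locate /var/www > all-paths.txt                       # everything in the locate database
    find /var/www -type d > dirs.txt                      # directories only, far fewer entries than files
    grep -vxFf dirs.txt all-paths.txt > files-only.txt    # drop exact directory matches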

    Does anyone else know of a tool that will fit his needs?
  • joonas
    joonas Posts: 9
    hey
    To my knowledge, using a pre-built slocate database is the quickest method. What is slowing it down is the per-file verification in the locate loop; unfortunately, if you turn that off it will also include the directories in your output file.

    I did not even reach this step, because I first have to create the database...
    Getting the files out of the database, once it exists, would be no issue; even a customised regex would only take minutes.

    Tries so far:
    find /var/www/ -type f >> FullFileList.txt:
    writes about 5 million lines (files only) an hour
    
    fails somewhere at about 50 million lines, mostly at a similar size (maybe a read error, corrupted filename, etc.)
    

    updatedb  --database-root /var/www  --output /var/WWWMLOCATEDB
    
    Only one try done so far, but it was only writing 35,000 lines an hour and the process failed after 4 hours :/

    Does anyone else know of a tool that will fit his needs?
    yes please :) :woohoo:
  • mfillpot
    mfillpot Posts: 2,177
    The failure may have been due to a file size limitation; have you reviewed the size of the output files?

    This just came to me: if the indexing is failing because of size, then a database solution may be the best bet. It may take a while, but it will be easy to search and use.
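
    One quick check for a per-process file-size limit, run in the same shell/session that runs the find:
    # does the shell impose a maximum size on files it (and its children) may write?
    ulimit -f                 # prints "unlimited" or a numeric block limit
    # and compare with how far the last run actually got
    ls -lh FullFileList.txt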
  • joonas
    joonas Posts: 9
    mfillpot wrote:
    The failure may have been due to a file size limitation; have you reviewed the size of the output files?
    This just came to me: if the indexing is failing because of size, then a database solution may be the best bet. It may take a while, but it will be easy to search and use.


    No, it stopped between 2 and 3 GB.
    There is a TB-sized image file on the same partition...
  • mfillpot
    mfillpot Posts: 2,177
    joonas wrote:
    No, it stopped between 2 and 3 GB.
    There is a TB-sized image file on the same partition...

    With that being the case, I agree with you that a corruption or disk I/O error is causing the issue, which will have to be repaired before you can complete the indexing operation.

    On a side note, due to the massive count of files you are trying to index, a website search can take a while going through a text file. So I went ahead and made a mysql schema and a bash script that can be used for an initial indexing; it takes a while but can be beneficial. The benefits you get from database indexing are queryable searches and the ability to code the site to automatically add new files to, and remove deleted files from, the database.

    Again, my script can take a while, but if you are interested I can post it here.
  • joonas
    joonas Posts: 9
    I still wonder if there isn't a way to make the find more error-resistant, or to just continue the file from a certain line.

    It will of course be nice to try your script! :)
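
    For the error-resistance part, one small thing that might help is redirecting find's error output to its own log, so diagnostics don't get lost in the listing and the log shows what it hit just before stopping (find normally reports unreadable entries and carries on):
    find /var/www/ -type f >> FullFileList.txt 2>> find-errors.log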
  • mfillpot
    mfillpot Posts: 2,177
    I will deliver my script and mysql table creation scripts when I have some time available in the next few days.

    I personally have never seen find or slocate error out and freeze due to a corruption (I have seen corruptions before and tested indexing of the corrupt files); I am now questioning whether ls would even properly display the bad file. I know the apps do have a certain level of error detection and correction, but until we know exactly what is causing the error no mechanism can be built to avoid it. Since you stated the indexing freezes at roughly the same point, I would advise reviewing where it stopped and navigating to that location to see what file or filename is causing the failure.
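
    A minimal way to do that review, assuming the partial FullFileList.txt from the failed run is still around (the last line may itself be truncated):
    # show the last path that made it into the listing
    tail -n 1 FullFileList.txt
    # list that file's directory, escaping non-printable characters in names
    ls -lab "$(dirname "$(tail -n 1 FullFileList.txt)")"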
  • joonas
    joonas Posts: 9
    Hello :)
    I was finally able to complete a file list with find.

    Now that the file list is done, I'm thinking more about:
    creating the database,
    and maybe storing text/html/css files within the database, and also, in the not-so-near future, ultimately buying additional hard drives and doing a fast full-text search...

    About your script:
    At first I thought you already had a completed one, but now I'm pretty impressed by your offer to help with a script you are making especially for this issue! It does not have to be working already (especially since part of the issue is gone), but I will be curious anyway to look through the code/ideas you already wrote down.

    thanks, Joonas
  • mfillpot
    mfillpot Posts: 2,177
    It is good to see that your filesystem finally stabilized. Below are my scripts: one to set up the schema and tables, and another to verify the files and place them into the database. The second script includes a verification mechanism, so you can also use it to update the index.

    As with anything else, this is my first attempt at this script and there may be better ways, but we have to start somewhere.

    Create schema and tables
    #!/bin/bash
    #
    # This script creates the mysql schema and tables for file indexing
    
    USER="root"
    PW="password"
    
    DBNAME="fileindex"
    
    mysql -u $USER --password=$PW --execute="create database $DBNAME;"
    mysql -u $USER --password=$PW --database=$DBNAME --execute="create table files (file_id int not null auto_increment primary key, dir_id int, filename varchar(100),verified bool);"
    mysql -u $USER --password=$PW --database=$DBNAME --execute="create table dirs (dir_id int not null auto_increment primary key, dirname varchar(100),verified bool);"
    

    Write the files and directories into the database
    #!/bin/bash
    
    # Index all files and directories from a directory into mysql
    
    USR="user"
    PW="password"
    DBNAME="fileindex"
    BASELOC="/var/www"
    
    
    # Mark all entries as unverified so the missing files will be tagged
    CMD="UPDATE dirs SET verified=false"
    mysql -u $USR --password=$PW --database=$DBNAME --execute="$CMD"
    CMD="UPDATE files SET verified=false"
    mysql -u $USR --password=$PW --database=$DBNAME --execute="$CMD"
    
    # Start storing the directories
    find "$BASELOC" -type d | while read -r LOC  # read paths line by line so names with spaces survive
    do
      CMD="SELECT dir_id from dirs where dirname='$LOC'"
      OUT=`mysql -u $USR --password=$PW --database=$DBNAME --execute="$CMD"`
      if [ -z "$OUT" ]; then
        CMD="INSERT INTO dirs (dirname,verified) VALUES('$LOC',true)"
      else
        CMD="UPDATE dirs SET verified=true WHERE dirname='$LOC'"
      fi
      mysql -u $USR --password=$PW --database=$DBNAME --execute="$CMD"  
    done
    
    # Start storing the files
    find "$BASELOC" -type f | while read -r LOC  # same line-by-line read for the file pass
    do
      DNAME=`dirname "$LOC"`
      LNAME=`basename "$LOC"`
    
      CMD="SELECT dir_id from dirs where dirname='$DNAME'"
      DNUM=`mysql -u $USR --password=$PW --database=$DBNAME --execute="$CMD"`
      DNUM=`echo "$DNUM"|grep -v dir_id`
    
      CMD="SELECT file_id from files where filename='$LOC' and dir_id='$DNUM"
      OUT=`mysql -u $USR --password=$PW --database=$DBNAME --execute="$CMD"`
      if [ -z "$OUT" ]; then
        CMD="INSERT INTO files (dir_id,filename,verified) VALUES('$DNUM','$LNAME',true)"
      else
        CMD="UPDATE files SET verified=true WHERE filename='$LNAME' and dir_id='$DNUM'"
      fi
      mysql -u $USR --password=$PW --database=$DBNAME --execute="$CMD"  
    done
    
    # Remove the missing files and directories
    CMD="DELETE dirs.* from dirs where verified=false"
    mysql -u $USR --password=$PW --database=$DBNAME --execute="$CMD"
    CMD="DELETE files.* from files where verified=false"
    mysql -u $USR --password=$PW --database=$DBNAME --execute="$CMD"
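
    Once both scripts have run, the index can be queried along these lines, joining files back to their directories to rebuild full paths (a sketch using the same $USR/$PW/$DBNAME variables as above; the .pdf pattern is just an example):
    # list all indexed .pdf files with their full paths
    mysql -u $USR --password=$PW --database=$DBNAME --execute="
      SELECT CONCAT(dirs.dirname, '/', files.filename)
      FROM files JOIN dirs ON files.dir_id = dirs.dir_id
      WHERE files.filename LIKE '%.pdf';"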
    
