Welcome to the Linux Foundation Forum!

find /var/www/ -type f >> FullFileList.txt

I am trying to get some structure into a poorly sorted collection of 100 million html, jpg, flv, swf, doc, pdf, etc. files, which are spread across about a million subfolders.

find /var/www/ -type f >> FullFileList.txt

A whole run takes about 20 hours and produces about 100 million lines,

but for some reason it always fails somewhere around 60%, at a file size of about 2 GB.

I tried it about 10 times in a row.

The filesystem is XFS.

Maybe it's just one corrupted filename or a read error, but it doesn't stop at exactly the same line every time.

Now I wonder if there is a way to continue,

or, if not, whether there is another method to produce the same listing, one which can also continue instead of restarting.
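One way to make the listing resumable (a sketch, not from the thread; it assumes the tree can be split at the top level) is to index one subdirectory at a time and mark finished chunks, so a crash only loses the current chunk instead of 20 hours of work:

```shell
#!/bin/sh
# Resumable listing sketch: BASE would be /var/www in the real run;
# a tiny demo tree is built here so the script is self-contained.
BASE="$(mktemp -d)"
OUTDIR="$(mktemp -d)"
mkdir -p "$BASE/a" "$BASE/b"
touch "$BASE/a/one.html" "$BASE/a/two.jpg" "$BASE/b/three.pdf"

for dir in "$BASE"/*/; do
    chunk="$OUTDIR/$(basename "$dir").txt"
    # a ".done" marker means this subtree finished on an earlier run
    [ -f "$chunk.done" ] && continue
    find "$dir" -type f > "$chunk" 2>> "$OUTDIR/errors.log" && touch "$chunk.done"
done

# stitch the chunks together once every subtree is marked done
FULL="$(mktemp)"
cat "$OUTDIR"/*.txt > "$FULL"
wc -l < "$FULL"
```

Re-running the script after a failure skips every chunk with a `.done` marker and redoes only the subtree that was interrupted.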

Thanks a lot in advance,

Jonas

Comments

  • Posts: 2,177
    Have you tried updating your slocate database using the "updatedb" command, then doing the search using locate to bypass the direct find call?

    An example would be:
    locate /var/www | while read -r FIL; do if [ -f "$FIL" ]; then echo "$FIL" >> ~/files.txt; fi; done
  • Posts: 9
    hey,
    okay
    right, I did not try this, thank you
    updatedb --database-root /var/www --output /var/WWWMLOCATEDB

    Question: will I be able to continue this in case it also fails at a certain point?

    It will probably take until tomorrow until I can see/report the results.
    The file seems to be growing more slowly than it did with find, and taking fewer resources (maybe it's restricted by a setting?)

    btw - do you guys know any open-source/free search-engine-like tool which full-text indexes the whole folder (100 million files, 1 million folders, 5000 GB) with high performance and makes the search available to website visitors?
  • Posts: 9
    updatedb seems too slow?
    Running for 2 hours, it produced only 75,000 lines, whereas find did 10 million lines in that time.
    It's constantly using only 20 MB of RAM; maybe that's a setting somewhere?
  • Posts: 2,177
    I'm sorry, I did not realize the scope of your listing. With the quantity of files you have, I do not know of a program or method that can index the files in a more timely fashion.

    To my knowledge, using a pre-built slocate database is the quickest method. What is slowing it down is the file verification step; unfortunately, if you turn that off, the directories will also appear in your output file.

    Does anyone else know of a tool that will fit his needs?
  • Posts: 9
    hey
    To my knowledge, using a pre-built slocate database is the quickest method. What is slowing it down is the file verification step; unfortunately, if you turn that off, the directories will also appear in your output file.

    I did not even reach that step, because I first have to create the database...
    Getting the files out of the database, once it exists, would be no issue; even a customised regex would only take minutes.

    Tries so far:

    find /var/www/ -type f >> FullFileList.txt
    - writes about 5 million lines (files only) an hour
    - fails somewhere at about 50 million lines, mostly at a similar size (maybe a read error, corrupted filename, etc.)

    updatedb --database-root /var/www --output /var/WWWMLOCATEDB
    - only one try done so far, and it was writing only 35,000 lines an hour; the process failed after 4 hours :/

    Does anyone else know of a tool that will fit his needs?
    yes please :) :woohoo:
  • Posts: 2,177
    The failure may have been due to a file-size limitation; have you reviewed the size of the output files?

    This just came to me: if this indexing is failing because of size, then a database solution may be the best bet. It may take a while, but it will be easy to search and use.
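One thing worth checking on the ~2 GB stop (my suggestion, not from the thread): it sits right at the 32-bit file-size boundary of 2^31 bytes, which can come from a 32-bit build without large-file support or from a per-process limit:

```shell
# ulimit -f reports the per-process file-size cap in 512-byte blocks;
# "unlimited" rules this particular cause out.
ulimit -f

# 2^31 bytes expressed in 512-byte blocks, for comparison with the above:
echo $((2147483648 / 512))
```

If `ulimit -f` prints a number close to 4194304, the listing is being truncated by the shell's limit rather than by a bad file.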
  • Posts: 9
    mfillpot wrote:
    The failure may have been due to a file-size limitation; have you reviewed the size of the output files?
    This just came to me: if this indexing is failing because of size, then a database solution may be the best bet. It may take a while, but it will be easy to search and use.


    No, it stopped between 2 and 3 GB only.
    There is a TB-sized image file on the same partition...
  • Posts: 2,177
    joonas wrote:
    No, it stopped between 2 and 3 GB only.
    There is a TB-sized image file on the same partition...

    With that being the case, I agree with you that a corruption or disk I/O error is causing the issue, and it will have to be repaired before you can complete the indexing operation.

    On a side note: due to the massive number of files you are trying to index, a website search can take a while going through a text file. So I went ahead and made a mysql schema and bash script that can be used for an introductory indexing; it takes a while, but it can be beneficial. The benefits you get from database indexing are queryable searches and the ability to code the site to automatically add new files and remove deleted files from the database.

    Again my script can take a while, but if you are interested I can post the script here.
  • Posts: 9
    I still wonder if there isn't a way to make find more error-resistant, or a way to just continue the listing from a certain line.

    It will of course be nice to try your script! :)
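On the error-resistance question: find does not stop at unreadable entries if its complaints are captured separately (a sketch on a throwaway tree; the real path would be /var/www):

```shell
#!/bin/sh
# Demo: one readable and one unreadable directory. Errors go to a log
# instead of killing the listing.
BASE="$(mktemp -d)"
mkdir -p "$BASE/open" "$BASE/blocked"
touch "$BASE/open/ok.html"
chmod 000 "$BASE/blocked"          # simulate an unreadable directory

find "$BASE" -type f > "$BASE.list" 2> "$BASE.errors" || true

chmod 755 "$BASE/blocked"          # restore so the temp dir can be cleaned up
grep -c 'ok.html' "$BASE.list"     # the readable file still got listed
```

This does not explain a hard stop at 2 GB, but it does keep a single bad entry from ending the run.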
  • Posts: 2,177
    I will deliver my script and mysql table-creation scripts when I have some time available in the next few days.

    I personally have never seen find or slocate error out and freeze due to a corruption (I have seen corruptions before and tested indexing of the corrupt files), and I am now questioning whether ls would even properly display the bad file. I know the apps do have a certain level of error detection and correction, but until we know exactly what is causing the error, no mechanism can be built to avoid it. Since you stated the indexing stops at a similar point each time, I would advise reviewing where it stopped and navigating to that location to see what file or filename is causing the failure.
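A minimal sketch of that "review where it stopped" step (a tiny stand-in listing is built here; in reality LIST would be the partial FullFileList.txt from the failed run):

```shell
#!/bin/sh
# Build a one-line stand-in listing so the sketch is self-contained.
LIST="$(mktemp)"
DEMO="$(mktemp -d)"
touch "$DEMO/ok.html"
printf '%s\n' "$DEMO/ok.html" > "$LIST"

last=$(tail -n 1 "$LIST")       # last file that made it into the listing
lastdir=$(dirname "$last")      # the directory find was walking when it died
# cat -v makes control characters in odd filenames visible
ls -a "$lastdir" | cat -v
```

Whatever entry sorts just after the last listed file in that directory is the first suspect for a corrupt name or read error.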
  • Posts: 9
    Hello :)
    I was now able to complete a file list with find.

    Now that the file list is done, I am thinking more about:
    creating the database,
    and maybe storing text/html/css files within the database, and in the not-so-near future ultimately buying additional hard drives and doing a fast full-text search......

    About your script:
    At first I thought you already had a completed one, but now I am pretty impressed by your offer to help with a script you are making specially for this issue! It doesn't have to work already (especially since part of the issue is gone), but I will be curious anyway to look through the code/ideas you already wrote down.

    thanks, Joonas
  • Posts: 2,177
    It is good to see that your filesystem finally stabilized. Below are my scripts: one to set up the schema and tables, then another script to verify the files and place them into the database. It includes a verification mechanism, so you can also use it for updating the index.

    As with anything else, this is my first attempt at this script and there may be better ways, but we have to start somewhere.

    Create schema and tables
    #!/bin/bash
    #
    # This script creates the mysql schema and tables for file indexing

    USER="root"
    PW="password"

    DBNAME="fileindex"

    mysql -u $USER --password=$PW --execute="create database $DBNAME;"
    mysql -u $USER --password=$PW --database=$DBNAME --execute="create table files (file_id int not null auto_increment primary key, dir_id int, filename varchar(100), verified bool);"
    mysql -u $USER --password=$PW --database=$DBNAME --execute="create table dirs (dir_id int not null auto_increment primary key, dirname varchar(100), verified bool);"

    Write the files and directories into the database
    #!/bin/bash

    # Index all files and directories from a directory into mysql

    USR="user"
    PW="password"
    DBNAME="fileindex"
    BASELOC="/var/www"

    # Mark all entries as unverified so the missing files will be tagged
    CMD="UPDATE dirs SET verified=false"
    mysql -u $USR --password=$PW --database=$DBNAME --execute="$CMD"
    CMD="UPDATE files SET verified=false"
    mysql -u $USR --password=$PW --database=$DBNAME --execute="$CMD"

    # Start storing the directories (read line by line so names with spaces survive)
    find "$BASELOC" -type d | while read -r LOC
    do
        CMD="SELECT dir_id from dirs where dirname='$LOC'"
        OUT=`mysql -u $USR --password=$PW --database=$DBNAME --execute="$CMD"`
        if [ -z "$OUT" ]; then
            CMD="INSERT INTO dirs (dirname,verified) VALUES('$LOC',true)"
        else
            CMD="UPDATE dirs SET verified=true WHERE dirname='$LOC'"
        fi
        mysql -u $USR --password=$PW --database=$DBNAME --execute="$CMD"
    done

    # Start storing the files
    find "$BASELOC" -type f | while read -r LOC
    do
        DNAME=`dirname "$LOC"`
        LNAME=`basename "$LOC"`

        CMD="SELECT dir_id from dirs where dirname='$DNAME'"
        DNUM=`mysql -u $USR --password=$PW --database=$DBNAME --execute="$CMD"`
        DNUM=`echo "$DNUM"|grep -v dir_id`

        CMD="SELECT file_id from files where filename='$LNAME' and dir_id='$DNUM'"
        OUT=`mysql -u $USR --password=$PW --database=$DBNAME --execute="$CMD"`
        if [ -z "$OUT" ]; then
            CMD="INSERT INTO files (dir_id,filename,verified) VALUES('$DNUM','$LNAME',true)"
        else
            CMD="UPDATE files SET verified=true WHERE filename='$LNAME' and dir_id='$DNUM'"
        fi
        mysql -u $USR --password=$PW --database=$DBNAME --execute="$CMD"
    done

    # Remove the missing files and directories
    CMD="DELETE dirs.* from dirs where verified=false"
    mysql -u $USR --password=$PW --database=$DBNAME --execute="$CMD"
    CMD="DELETE files.* from files where verified=false"
    mysql -u $USR --password=$PW --database=$DBNAME --execute="$CMD"
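One caveat with the script above at this scale (my assumption, not load-tested): it spawns several mysql processes per file, which for 100 million files means billions of round trips. A likely faster route is to turn the existing FullFileList.txt into a tab-separated file and bulk-load it in a single statement (the "staging" table name below is hypothetical):

```shell
#!/bin/sh
# A tiny stand-in for the real FullFileList.txt so the sketch runs anywhere.
LIST="$(mktemp)"
printf '%s\n' /var/www/a/x.html /var/www/a/y.jpg /var/www/b/z.pdf > "$LIST"

TSV="$(mktemp)"
# Split each path into directory and basename, tab-separated.
awk '{
    n = split($0, parts, "/")
    file = parts[n]
    dir = substr($0, 1, length($0) - length(file) - 1)
    printf "%s\t%s\n", dir, file
}' "$LIST" > "$TSV"

head -n 1 "$TSV"

# Then one bulk load instead of millions of round trips:
# mysql --local-infile -u "$USR" --password="$PW" --database=fileindex \
#   --execute="LOAD DATA LOCAL INFILE '$TSV' INTO TABLE staging (dirname, filename);"
```

The verified/cleanup logic from the script above can then run as set-based UPDATE/DELETE statements against the staging table instead of per-file queries.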
