For a data conversion I created many CSV data files, which I load using the services defined at the top of each CSV file. Loading them all takes about two hours.
How can I get a summary of which files failed to load, for whatever reason?
If a load takes two hours, I suggest using techniques other than CSV file loading, such as a typical database migration.
The way to tell whether a data file failed to load is to watch the load process: a failure produces a logged error (shown in red) that you can look for.
I usually identify the problem file and upload it to the Data Import screen to get the exact line where the problem is.
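If you have two hours of load output, scanning it by eye is tedious. A small script can pull out the failures for you. This is only a sketch: the log format below is hypothetical (your logger's actual layout will differ), so adjust the pattern to match the error lines your loader actually emits.

```python
import re

# Hypothetical log excerpt; the real format depends on your logger
# configuration. Adjust SAMPLE_LOG and the regex to match your output.
SAMPLE_LOG = """\
INFO  Loading data file customers.csv
INFO  Loaded 25000 records from customers.csv
ERROR Error loading data file orders.csv: constraint violation on line 1042
INFO  Loading data file products.csv
ERROR Error loading data file invoices.csv: bad date format on line 87
"""

def failed_files(log_text):
    """Return the CSV file names mentioned on ERROR lines, in order."""
    pattern = re.compile(r"^ERROR\b.*?(\S+\.csv)", re.MULTILINE)
    return pattern.findall(log_text)

print(failed_files(SAMPLE_LOG))  # → ['orders.csv', 'invoices.csv']
```

Redirect the loader's output to a file during the two-hour run, then point this at it afterwards to get the summary of failed files in one place.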
Thank you for the answer.
If you are loading data in many CSV files it will report which files succeeded or failed (well, depending on how you do it… it will with the command-line data loader or the Tools app Data Import screen when loading by type and/or component). Each file succeeds or fails independently, i.e. if a file fails the transaction gets rolled back and nothing in the file will have been committed to the database.
As @michael mentioned, it's not a great tool for loading very large files. One reason is the transaction per file. If you have lots of small files it isn't a problem, but with very large files you end up with very large, long-running transactions. In any relational database, as the number of changes in a single transaction gets very large, each insert or update slows down dramatically (because uncommitted data is not yet indexed but must still be checked for each operation). I've seen cases where the time per insert or update near the end of the file is 100x the time for each operation at the beginning of the file. It can get bad!
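Given the transaction-per-file behavior, one practical workaround is to split a very large CSV data file into many smaller ones before loading, so each transaction stays small. Here is a sketch of that idea; it assumes the first line of the file holds the service/header declaration that the loader expects at the top of every file (as described in the original question), so that line is repeated in each chunk. The function name and chunk naming scheme are my own invention.

```python
import os

def split_csv(path, lines_per_chunk=25000, out_dir="."):
    """Split a large CSV data file into smaller chunk files.

    The first line (assumed to be the service/header declaration the
    loader expects at the top of every file) is repeated in each chunk,
    so every chunk can be loaded independently in its own transaction.
    Returns the list of chunk file paths, in order.
    """
    base, ext = os.path.splitext(os.path.basename(path))
    chunk_paths = []
    with open(path, encoding="utf-8") as src:
        header = src.readline()  # service/header line, repeated per chunk
        chunk, n = [], 0

        def flush():
            nonlocal chunk, n
            if not chunk:
                return
            n += 1
            out_path = os.path.join(out_dir, f"{base}.part{n:03d}{ext}")
            with open(out_path, "w", encoding="utf-8") as dst:
                dst.write(header)
                dst.writelines(chunk)
            chunk_paths.append(out_path)
            chunk = []

        for line in src:
            chunk.append(line)
            if len(chunk) >= lines_per_chunk:
                flush()
        flush()  # write any remaining partial chunk
    return chunk_paths
```

Each resulting file then loads in its own short transaction, which sidesteps the 100x slowdown near the end of a huge file, and a failure only rolls back one small chunk instead of the whole data set.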
Thanks for your reply. I have used it with about 25,000 lines per file and it works fine.