OK, I cannot find any contention issues. The process basically takes denormalized data from a source table, parses it, does some calculations, and then inserts the normalized records into a fact table. Roughly 25,000 records are processed in any one run, and of those only 1,000–2,000 are actually inserted or updated in the fact table.
This process takes less than a minute on my development server, which has less processing power, less RAM, etc., yet it is currently taking over 6 hours on my production server. The indexes are the same, there is 6 GB of free space on the production server (more than enough), and other loading processes are not affected.
Any suggestions on what I can look for to help diagnose this?
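For anyone else hitting this: one thing worth doing while the slow load is running is to look at what the session is actually waiting on. This is a rough sketch assuming SQL Server (the post mentions fact tables and indexes but never names the engine); on another RDBMS the equivalent views will differ:

```sql
-- While the load is running: what is each active session waiting on?
-- (standard SQL Server DMVs; session_id > 50 skips most system sessions)
SELECT r.session_id,
       r.status,
       r.command,
       r.wait_type,
       r.wait_time,
       r.cpu_time,
       r.logical_reads
FROM   sys.dm_exec_requests AS r
WHERE  r.session_id > 50;

-- Cumulative wait stats since the last restart, heaviest first
SELECT TOP (10)
       wait_type,
       waiting_tasks_count,
       wait_time_ms,
       signal_wait_time_ms
FROM   sys.dm_os_wait_stats
ORDER  BY wait_time_ms DESC;
```

If the top waits point at I/O (e.g. PAGEIOLATCH_*) versus CPU or blocking, that usually narrows down whether the difference between dev and production is hardware, a stale/different execution plan, or something blocking the session.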