• A single 5-million-row INSERT can be a huge problem: it may fill up the transaction log even if the database is in the simple recovery model.  A minimally logged bcp import is probably the best way to handle it as one file while minimizing that impact.
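
    For example, a sketch of the export/import pair, assuming a hypothetical table dbo.BigTable, trusted connections, and made-up server and path names.  Native format (-n) avoids character conversion, and on the destination the TABLOCK hint is what permits minimal logging (into a heap, with the database in the simple or bulk-logged recovery model), while -b commits in batches so the log can be reused as the load progresses:

        REM Export from the source server in native format
        bcp SourceDb.dbo.BigTable out D:\staging\BigTable.dat -S SourceServer -T -n

        REM Import on the destination; TABLOCK enables minimal logging on a heap,
        REM and the batch size keeps any single transaction manageable
        bcp DestDb.dbo.BigTable in D:\staging\BigTable.dat -S DestServer -T -n -h "TABLOCK" -b 100000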

    The process would run fastest if the flat file path is on a separate drive from the SQL data files on both servers.

    If the network is a serious bottleneck, you could try exporting to a file on the source server, copying the file to the destination, and importing it from there.  It is the same number of bytes transferred, but I have seen cases where a Windows file copy between servers uses the network more efficiently than SQL Server I/O.
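
    A sketch of that variation, reusing the hypothetical names above; robocopy stands in for whatever file-copy mechanism you prefer, and the \\DestServer\staging share is assumed to map to E:\staging on the destination:

        REM 1. Export locally on the source server
        bcp SourceDb.dbo.BigTable out D:\staging\BigTable.dat -S localhost -T -n

        REM 2. Copy the file with a plain Windows file transfer
        robocopy D:\staging \\DestServer\staging BigTable.dat

        REM 3. Import locally on the destination server
        bcp DestDb.dbo.BigTable in E:\staging\BigTable.dat -S localhost -T -n -h "TABLOCK" -b 100000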

    You could split the export into multiple files in SSIS even though the table has no index or key column.  Create a set of flat file destinations, and use a source query of "SELECT *, FileNumber = (ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) % n) FROM table" (where n = the number of files).  Then use a Conditional Split component to distribute the rows by the FileNumber column, as sketched below.  This requires only one table scan at the source, so it shouldn't make the export take longer, and you may have better luck loading several smaller files than one big one.  (You could also do the split in a Script Component, incrementing a counter and redirecting each row with relatively simple C#.  The number of files wouldn't have to be hardcoded in the script; it could simply use the number of outputs.)
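
    A minimal sketch of that source query with n = 4, together with the matching Conditional Split conditions (dbo.BigTable is again a hypothetical name):

        -- SSIS source query for n = 4 flat file destinations;
        -- ORDER BY (SELECT NULL) tells ROW_NUMBER() that no particular order is required
        SELECT t.*,
               FileNumber = ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) % 4
        FROM dbo.BigTable AS t;

        -- Conditional Split expressions, one per output:
        --   FileNumber == 0, FileNumber == 1, FileNumber == 2,
        --   with the default output feeding the fourth file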