I'm not really seeing the problems.
I've not done this exact task, but I've done similar things often enough to understand the overheads.
I would probably generate the class the row is read into using EF.
I've never seen that somehow generate the wrong class.
Just a click of a button and the files are regenerated from the T4 templates; a save is all it takes, and it's very simple to do.
Beyond that, there's no need to ensure proper type matching by hand: you already have the type.
If the table changes, the only change needed is to regenerate the class and paste it into your app.
The type drives everything in my suggested approach.
Reflection is only slow at the point you actually use it, and you can arrange for that to happen once per run rather than once per row. Even with a table that has many columns, that overhead would be trivial.
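The once-per-run idea can be sketched by caching the reflected metadata the first time a type is seen (names here are illustrative, not code from a real app):

```csharp
using System;
using System.Collections.Concurrent;
using System.Reflection;

public static class PropertyCache
{
    // One lookup per type per run; every later call hits the cache.
    private static readonly ConcurrentDictionary<Type, PropertyInfo[]> Cache = new();

    public static PropertyInfo[] For(Type t) =>
        Cache.GetOrAdd(t, x => x.GetProperties(BindingFlags.Public | BindingFlags.Instance));
}
```

After the first call for a given type, the per-row cost is just a dictionary lookup.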
I have a fair bit of code that uses this technique of iterating properties to copy values from one type to another. It's for desktop apps that copy data out of a data layer into a viewmodel and back again to commit changes.
I have live apps which do this with hundreds of records routinely.
I've tested with thousands of records.
I just do the reflection for each record in my standard routine, because the overhead isn't worth coding around.
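A minimal sketch of that kind of copy routine (the class and property names are illustrative, not my actual code):

```csharp
using System.Reflection;

public static class Mapper
{
    // Copy every readable source property to a writable target property
    // with the same name and type, via reflection.
    public static void CopyProperties(object source, object target)
    {
        foreach (var sp in source.GetType().GetProperties(BindingFlags.Public | BindingFlags.Instance))
        {
            if (!sp.CanRead) continue;
            var tp = target.GetType().GetProperty(sp.Name, BindingFlags.Public | BindingFlags.Instance);
            if (tp == null || !tp.CanWrite || tp.PropertyType != sp.PropertyType) continue;
            tp.SetValue(target, sp.GetValue(source));
        }
    }
}

// Example pair of types: a data-layer row and a viewmodel.
public class CustomerRow { public int Id { get; set; } public string Name { get; set; } }
public class CustomerViewModel { public int Id { get; set; } public string Name { get; set; } }
```

The same routine works for any pair of types, which is why a regenerated entity class doesn't force changes here.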
Static data is usually just that.
I would expect changes to table structure to be pretty rare by the time you have a live test and dev database.
But let's say that isn't the case.
You can of course get SQL Server to tell you about a schema change to a table.
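For example, sys.tables exposes a modify_date that is updated whenever the table is ALTERed, so a simple poll can detect a change. A rough sketch (the query would be run against the database with SqlCommand or similar; the comparison helper is mine):

```csharp
using System;

public static class SchemaCheck
{
    // sys.tables.modify_date is bumped by ALTER TABLE, so comparing it
    // against the last value you saw reveals a schema change.
    public const string Sql =
        "SELECT modify_date FROM sys.tables WHERE name = @table";

    public static bool SchemaChangedSince(DateTime lastKnown, DateTime modifyDate)
        => modifyDate > lastKnown;
}
```

DDL triggers are another option if you want to be told rather than having to ask.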
Maybe this isn't a static table and is changing very dynamically.
You could possibly dynamically generate the type.
But there's presumably still going to be some code somewhere which is reading and writing that data.
With a particularly dynamic table you're best advised to use code-first rather than database-first EF. Once you do that, you have a class that defines the database table. Just put that in a common DLL and reference it.
You then never need change the code which compares the two tables.
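Such a shared code-first entity can be as simple as this (a hypothetical table, not from the original question; by EF convention a property named Id becomes the primary key, and the DbSet registration lives in each app's DbContext):

```csharp
// Lives in a common dll referenced by both apps: the class itself
// is the single definition of the table's shape.
public class LookupItem
{
    public int Id { get; set; }           // primary key by EF convention
    public string Description { get; set; }
}
```

Change the class, and both the table (via a migration) and every app referencing the DLL pick up the new shape together.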