• Charles Wannall (10/22/2010)


    I can see immediate usefulness in a project where I intend to use it today: I have large tables in Access databases spread all over the country. Ancient app, many users. Not going away. But... we must centralize the data for control and for management review and reporting. There's more involved, obviously, but this little gem of an article shows me how I can organize my transmission into 5K-row chunks and send them to a stored procedure in SQL Server that dumps them in a holding table - no processing except landing in the dump table. Later, I can follow up on SQL Server with a scheduled job to process the rows into the proper tables.
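
    For the landing call, something like this minimal C# sketch is all the client side needs. The proc name dbo.usp_LandChunk, the holding table, and the parameter names are hypothetical stand-ins; the proc itself would do nothing but INSERT the XML payload into the dump table:

        // Send one ~5K-row XML chunk to SQL Server and return immediately.
        // Assumes a hypothetical proc dbo.usp_LandChunk(@sourceGuid, @chunk xml)
        // whose only job is:
        //   INSERT INTO dbo.HoldingTable (SourceGuid, Payload, ReceivedAt)
        //   VALUES (@sourceGuid, @chunk, GETDATE());
        using System;
        using System.Data;
        using System.Data.SqlClient;

        static class ChunkSender
        {
            public static void SendChunk(string connString, Guid sourceGuid, string chunkXml)
            {
                using (var conn = new SqlConnection(connString))
                using (var cmd = new SqlCommand("dbo.usp_LandChunk", conn))
                {
                    cmd.CommandType = CommandType.StoredProcedure;
                    cmd.Parameters.Add("@sourceGuid", SqlDbType.UniqueIdentifier).Value = sourceGuid;
                    cmd.Parameters.Add("@chunk", SqlDbType.Xml).Value = chunkXml;
                    conn.Open();
                    cmd.ExecuteNonQuery();   // lands in the holding table; no shredding here
                }
            }
        }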

    The user in Access sees minimal transmission time (no waiting for inserts on the server end). The data gets centralized (yes, GUIDs are added at the transmission source and used throughout from then on).

    Fantastic article. Exactly what I needed at exactly the right moment.

    The collection into XML cost about 20 lines, including the routine to convert a recordset data row into an XML node.
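
    An illustrative C# sketch of that kind of row-to-node routine (the actual code runs against an Access recordset; this IDataRecord version is just the same idea, and column names with spaces would need sanitizing before use as XML attribute names):

        // Turn one data row into an XML node like <row id="..." name="..."/>.
        using System;
        using System.Data;
        using System.Xml.Linq;

        static class RowToXml
        {
            public static XElement ToNode(IDataRecord record)
            {
                var node = new XElement("row");
                for (int i = 0; i < record.FieldCount; i++)
                {
                    object value = record.GetValue(i);
                    node.Add(new XAttribute(record.GetName(i),
                        value == DBNull.Value ? "" : value.ToString()));
                }
                return node;
            }
        }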

    I would suggest two ideas, both similar.

    1. Put an IDataReader on your Jet database (the Access .mdb file). As the fire-hose reader pulls the Jet data, throw the values into a strongly typed DataSet.

    2. Every X rows (make it configurable and experiment), push the DataSet's GetXml() output to a stored procedure (see the sketch after this list).
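
    A sketch of both steps together, in C#. The table name Customers, the proc dbo.usp_LandChunk, and the connection strings are assumptions for illustration; the Jet OLE DB provider string is the standard one for .mdb files:

        using System;
        using System.Data;
        using System.Data.OleDb;
        using System.Data.SqlClient;

        static class JetPump
        {
            public static void Pump(string mdbPath, string sqlConnString, int batchSize)
            {
                string jetConn = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + mdbPath;

                var ds = new DataSet("chunk");      // stand-in for a strongly typed DataSet
                var table = ds.Tables.Add("row");

                using (var conn = new OleDbConnection(jetConn))
                using (var cmd = new OleDbCommand("SELECT * FROM Customers", conn))
                {
                    conn.Open();
                    using (IDataReader reader = cmd.ExecuteReader())   // fire-hose, forward-only
                    {
                        // Build the table schema once from the reader.
                        for (int i = 0; i < reader.FieldCount; i++)
                            table.Columns.Add(reader.GetName(i), reader.GetFieldType(i));

                        while (reader.Read())
                        {
                            var values = new object[reader.FieldCount];
                            reader.GetValues(values);
                            table.Rows.Add(values);

                            if (table.Rows.Count >= batchSize)   // "every X rows"
                                Flush(ds, table, sqlConnString);
                        }
                        if (table.Rows.Count > 0)
                            Flush(ds, table, sqlConnString);     // remainder
                    }
                }
            }

            private static void Flush(DataSet ds, DataTable table, string sqlConnString)
            {
                string xml = ds.GetXml();            // <chunk><row>...</row>...</chunk>
                using (var conn = new SqlConnection(sqlConnString))
                using (var cmd = new SqlCommand("dbo.usp_LandChunk", conn))
                {
                    cmd.CommandType = CommandType.StoredProcedure;
                    cmd.Parameters.Add("@chunk", SqlDbType.Xml).Value = xml;
                    conn.Open();
                    cmd.ExecuteNonQuery();
                }
                table.Rows.Clear();                  // start the next batch
            }
        }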

    Deviation A

    1. If you care about bandwidth, zip the DataSet XML and send it to a service. Or check out binary methods for serializing DataSets.

    2. You could have a service that unzips (or takes the raw XML string) and processes the rows via bulk operations (a sketch of the zip step follows this list).
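
    A minimal sketch of the zip step, using GZipStream from System.IO.Compression. How the byte[] travels to the service (web service, WCF, whatever) is up to you; on the receiving end, DataSet.ReadXml() plus SqlBulkCopy is one way to do the bulk processing:

        using System.IO;
        using System.IO.Compression;
        using System.Text;

        static class XmlZipper
        {
            public static byte[] Zip(string xml)
            {
                byte[] raw = Encoding.UTF8.GetBytes(xml);
                using (var buffer = new MemoryStream())
                {
                    using (var gzip = new GZipStream(buffer, CompressionMode.Compress))
                        gzip.Write(raw, 0, raw.Length);
                    return buffer.ToArray();   // gzip stream is closed, so data is flushed
                }
            }

            public static string Unzip(byte[] zipped)
            {
                using (var input = new MemoryStream(zipped))
                using (var gzip = new GZipStream(input, CompressionMode.Decompress))
                using (var text = new StreamReader(gzip, Encoding.UTF8))
                    return text.ReadToEnd();   // back to the raw DataSet XML
            }
        }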

    OR

    Put your data into simple POCO objects. Add the [Serializable] attribute to the class(es).

    Binary-serialize and ship to the service (sketch below).
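
    A sketch of that route with BinaryFormatter (the stock binary serializer in the .NET of that era). The Customer class and its fields are made-up examples, and note that both ends must share the same assembly/type versions:

        using System;
        using System.Collections.Generic;
        using System.IO;
        using System.Runtime.Serialization.Formatters.Binary;

        [Serializable]
        public class Customer          // simple POCO; fields are illustrative
        {
            public Guid Id;            // the GUID assigned at the transmission source
            public string Name;
            public string City;
        }

        static class PocoShipper
        {
            public static byte[] Serialize(List<Customer> batch)
            {
                using (var ms = new MemoryStream())
                {
                    // A List<T> of [Serializable] types is itself serializable.
                    new BinaryFormatter().Serialize(ms, batch);
                    return ms.ToArray();   // ship these bytes to the service
                }
            }

            public static List<Customer> Deserialize(byte[] payload)
            {
                using (var ms = new MemoryStream(payload))
                    return (List<Customer>)new BinaryFormatter().Deserialize(ms);
            }
        }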

    ...

    I have an older blog entry which describes the first one:

    http://sholliday.spaces.live.com/Blog/cns!A68482B9628A842A!527.entry

    ...