March 14, 2013 at 3:18 am
Hello All,
I posted this in the SSIS forum and did not get any replies; I hope this is the right place for this question.
I am working at a small health plan, and I am trying to parse 834 flat-file data into a table using SSIS, SQL Server, or Visual Studio. The file can be very large at times. One of my problems is that the data elements are separated by { and each segment begins with ~ plus an abbreviation that tells you what type of data is coming next, for example ~AMT{.
After ~AMT{ will be the copay amount for the member.
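For example, splitting one segment on the { delimiter separates the identifier from its values. A minimal C# sketch (the value 20 is just a made-up copay amount):

string segment = "AMT{C1{20";           // hypothetical segment; C1 = co-payment amount qualifier
string[] elements = segment.Split('{');
string segmentId = elements[0];         // "AMT"
string qualifier = elements[1];         // "C1"
string copay     = elements[2];         // "20"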
834 Benefit Enrollments and Maintenance
Functional Group ID: BE
~INS - Insured Benefit or Member Level Detail
~REF - Reference Information: The REF segment is required to link the dependent(s) to the subscriber.
~DTP - Date or Time or Period
~NM1 - Individual or Organizational Name: Either NM1 or N1 will be included depending on whether an individual or organization is being specified.
~PER - Administrative Communications Contact
~N3 - Party Location
~N4 - Geographic Location
~DMG - Demographic Information
~EC - Employment Class
~LS - Loop Header
~LX - Transaction Set Line Number
~N1 - Party Identification
~REF - Reference Information
~AMT - Monetary Amount Information
{C1 - Co-payment Amount
~COB - Coordination of Benefits
~LX - Transaction Set Line Number
~N1 - Party Identification
~DTP - Date or Time or Period: Reporting Category Date
~LE - Loop Trailer: Additional Reporting Categories loop termination
~SE - Transaction Set Trailer: SE is the last segment of each transaction set.
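For reference while parsing, the segment IDs above could be kept in a small lookup; a sketch:

using System.Collections.Generic;

// Map each 834 segment ID to its description, for logging/validation
var segmentNames = new Dictionary<string, string>
{
    { "INS", "Insured Benefit / Member Level Detail" },
    { "REF", "Reference Information" },
    { "DTP", "Date or Time or Period" },
    { "NM1", "Individual or Organizational Name" },
    { "PER", "Administrative Communications Contact" },
    { "N3",  "Party Location" },
    { "N4",  "Geographic Location" },
    { "DMG", "Demographic Information" },
    { "EC",  "Employment Class" },
    { "LX",  "Transaction Set Line Number" },
    { "N1",  "Party Identification" },
    { "AMT", "Monetary Amount Information" },
    { "COB", "Coordination of Benefits" },
    { "LS",  "Loop Header" },
    { "LE",  "Loop Trailer" },
    { "SE",  "Transaction Set Trailer" }
};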
Here is an edited segment of what I am trying to enter into a table.
~INS{Y{18{001{43{A{C{{AC~REF{0F{A00000000~DTP{000{D8{yyyymmdd
~NM1{IN{1{LastName{FirstName{Initial~PER{IP{{HP{phonenumber{TE{phonenumber
~N3{MONTECITO{streetaddress~N4{city{state{zipcode{{CY{13
~DMG{D8{DateOfBirth{GenderMorF{I{H~LS{2700~LX{1~N1{75{ACTION CODE
~REF{ZZ{AC~LX{2~N1{75{RENEWAL DATE~DTP{007{D8{DOB~LE{2700
The total length of the enrollment segments is 60,000 characters, so entering it into a table is not possible, or at least not something I could do without cutting the file in half.
Ideally, I would like to enter the data like this:
ActionCode 001 -- means maintenance
Planid A00000000
DOB mmddyyyy
etc.
but I would be okay with
col1 INS
col2 y
col3 18
col4 001
col5 43
col6 a
col7 c
col8 ac
etc.
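As a sketch of that fallback layout, each parsed segment could be flattened into generic columns for a staging load (the pipe delimiter and file path here are made up):

using System;
using System.IO;

// "INS{Y{18{001{43{A{C{{AC" -> "INS|Y|18|001|43|A|C||AC"
string segment = "INS{Y{18{001{43{A{C{{AC";   // from the sample above
string row = string.Join("|", segment.Split('{'));
File.AppendAllText(@"C:\staging\segments.txt", row + Environment.NewLine);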
I would appreciate any help or assistance; I have searched Google and have not found a good solution.
Thanks in advance and have a great day!
Cheers!
The pain of Discipline is far better than the pain of Regret!
March 14, 2013 at 10:53 am
I do not typically recommend third-party solutions for SSIS, but in this case it's probably not worth re-inventing the wheel:
http://www.cozyroc.com/ssis/edi-source
There are no special teachers of virtue, because virtue is taught by the whole community.
--Plato
March 14, 2013 at 12:00 pm
Thank you for the reply and the recommendation. It is a pretty expensive option, but I am starting to think that if we need to truly parse the file correctly, it might be the only way to go.
I have made several attempts to create something with SSIS and then a custom SQL text function, and it has not been working out too well.
I also thought about trying VS with something like this:
using System;
using System.IO;
using System.Text;

using (TextReader rdr = new StreamReader(fullFilePath))
{
    string line;
    var element = new StringBuilder();
    while ((line = rdr.ReadLine()) != null)
    {
        foreach (char c in line)
        {
            switch (c)
            {
                case '{':
                    // End of an element: write element.ToString() to
                    // persistent storage or an in-memory collection,
                    // then start buffering the next element
                    element.Clear();
                    break;
                case '~':
                    // End of a segment: flush the last element and
                    // mark the segment as complete
                    element.Clear();
                    break;
                default:
                    element.Append(c);
                    break;
            }
        }
    }
}
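A simpler variation, assuming the whole file fits in memory and { and ~ never appear inside the data itself, would be to split on the delimiters directly:

using System;
using System.IO;

string contents = File.ReadAllText(fullFilePath);
foreach (string segment in contents.Split('~'))
{
    if (segment.Length == 0) continue;  // skip the empty piece before the first ~
    string[] elements = segment.Split('{');
    string segmentId = elements[0];     // INS, REF, DTP, NM1, ...
    // write segmentId/elements to a table or collection here
}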
Thanks again for your time I greatly appreciate it.
The pain of Discipline is far better than the pain of Regret!