Simple "IF" Logic in Package

  • Here is my plan:

    I want to bring a bunch of data from our AS400 to SQL Server 2000 each night. Sometimes, the AS400 may be unreachable for any reason.

    Currently, I have a task in my package that tests the connection to the AS400 just to see if it responds. This works great.

    However, my current problem is that even if the connection is working, the files may be empty or DB2 may not be responding at the time, which makes the tables appear to be empty.

    For whatever reason, I'd like to know if there is a very simple way to check if the source AS400 table contains data and, if not, don't clear the related SQL Server table.

    Is there a simple way I can do:

    "If there are no records, then, fail the current task"?

    Will I have to resort to an ActiveX task? If so, what is the syntax for correctly "failing" a task?


  • Take a look at the following:

    Do you want to fail the task or simply not execute it?

    Hope this helps

    Phill Carter


    Colt 45 - the original point and click interface


  • Thanks for the link!

    My goal is to fail the task so the On Failure workflow constraint is used rather than the On Success constraint.

    I made a simple working task using an ActiveX task. This task performs a simple record count on the source table. If the table has more than zero records, I set the task to Succeed. If not, I set the task to Failure.

    This seems to work perfectly for my needs.
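
    For anyone wanting the same thing, here's a sketch of that ActiveX task. The DSN, credentials, library and table names are all placeholders; adjust them to your environment:

    ```vbscript
    Function Main()
        Dim oConn, oRS
        Set oConn = CreateObject("ADODB.Connection")

        ' DSN and credentials below are hypothetical
        oConn.Open "DSN=AS400;UID=dtsuser;PWD=secret"
        Set oRS = oConn.Execute("SELECT COUNT(*) FROM MYLIB.MYTABLE")

        ' Fail the task when the source table is empty so the
        ' On Failure workflow constraint fires instead of On Success
        If oRS(0) > 0 Then
            Main = DTSTaskExecResult_Success
        Else
            Main = DTSTaskExecResult_Failure
        End If

        oRS.Close
        oConn.Close
    End Function
    ```

    The DTSTaskExecResult_Success/Failure return values are what DTS inspects to decide which workflow constraint to follow.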

    Regardless, thanks for the help!


  • md40,

    Phill's pointing you in the right direction; SQLDTS has some great techniques for control of flow and looping.

    I'm really interested in what you said about bringing data from the AS400 to SQL Server. Can you give me a quick summary of how you're doing it? I'm interested in doing the same thing at my job, but I'm not sure about the best way to proceed. Any advice you have would be appreciated.

    Signature is NULL

  • Our 400 is a big, sloppy mess of data. There are no constraints, integrity, indices, etc. The only things you might find are primary keys that may be one or multiple columns, but that's about it.

    In addition, the 400 goes down nightly and on weekends for backups and IPLs. I maintain our university's intranet site and I can't afford to have my site unavailable for hours while the 400 is doing its stuff.

    Therefore, I am building a warehouse, so to speak, by bringing data from the 400 to SQL. When I bring the data down (using DTS) I do some cleansing of the data as necessary. For example, our gender field contains M, F and any/every other character you may imagine.

    To do the data transfer, I installed Client Access on the SQL box. I setup some jobs that kick-off some DTS packages each night when I know the 400 is up and running.

    I have added some logic to each package to do a sample query to ensure the connection is available.

    Unfortunately, w/o major cleansing, I can't apply referential integrity constraints to my tables. The 400 data is too far gone with thousands of orphaned records.

    So, I just bring back what I can, cleanse what I can and use the resulting data.

    Do you have any specific questions about 400 -> SQL conversions?

  • Yeah...I'm not so interested in the data warehousing side; I've got that down pat.

    The main thing I'm interested in is how most people connect to an AS400 file and transfer the data to SQL. Do you use OLE DB? Or is it ODBC or some other connection utility?

    I know zero about how one would transfer the data from one system to the other. Any pointers you would have would be appreciated. Of course, I should probably just do my homework, but I haven't had time to research this at all.

    Signature is NULL

  • Hi Calvin,

    You can use MS Host Integration Server 2000 for the job. It comes with utilities and drivers for the purpose. If you are going to work with AS400 files rather than DB2, I know it has the capability but I've never used it. I use the OLE DB driver for DB2 that ships with it. Just install the HIS2000 Client part of the CD and use it to connect to DB2 as you would any other database. If you use TCP/IP it is just as simple as connecting to SQL; the only thing to note is that "database" to DB2 means the subsystem name, not the actual database as we know it in SQL. The TCP port also differs among the different DB2 releases, so get the DB2 admin involved.

    If you use IBM SNA network protocols instead, then it is a lot more complicated. You need to install the MS HIS2000 server to "gateway" you into the host VTAM/DB2 machine.

    You'll need to configure the HIS2000 server to add a VTAM "connection", for which you have to find out the VTAM Physical LU name assigned to you by the VTAM admin. Do not forget that you need to install the MS DLC driver [if you use Win2K or Win2K3 on the HIS2000 server as well; it also ships with the HIS2000 server CD]. You'll need to configure the HIS2000 Local LU, which can be any name you like. The DB2 admin needs to supply you the DB2 subsystem remote LU name and the host mode name. After that you need to configure HIS2000 to give privileges to the server users [normally Administrator] to use HIS2000, after creating users from within HIS2000.

    Then you will need to run a utility supplied by MS on the same HIS2000 CD, called createpkg or something similar. You will have to get the DB2 sysadmin to use it to connect to DB2 and create the packages needed on DB2 to manage the isolation-level packages needed by OLE DB, and to grant you permissions to use such packages.

    Once that is done, use the HIS2000 CD client installation to load the MS OLE DB driver on the SQL server [not needed if you install the HIS2000 server on the same SQL machine]. You will still need a lot of info from the DB2 admin:

    --Remote LU Name [DB2 subsystem name as defined in VTAM]

    --DB2 codepage

    --DB2 Package Collection name[as created by crtpkg mentioned above]

    --DB2 location name referred to as syslocation in DB2

    --DB2 database name, normally the same as syslocation or location name

    --DB2 default schema name normally SYSIBM or your own user name or the DB2 owner name of the tables you're accessing

    IBM ships to its customers a product called IBM Personal Communications / DB2 Connect, on CDs that include an ODBC driver. You can use it instead of MS OLE DB, but I find it slower, and you need the same steps above to configure it with SNA. With both OLE DB and ODBC, if you use TCP/IP all you need is:

    the DB2 IP address, port and database name (again, known as syslocation in DB2). In the case of IBM ODBC you do not need syspackage as it uses IBM defaults, although note that if you have a non-default install on the DB2 host then the DB2 admin needs to use IBM DB2 Connect to configure the packages.

    Use the OLE DB driver from within DTS packages to access DB2 by creating connections or using DTS ActiveX scripts. We always use SQL linked servers for DB2 to access the host data using SQL Server OpenQuery or OpenRowSet, and less frequently use the fully qualified names capability provided by SQL. Its format is "Select * from LinkedServerName.Db2SubSystemName.DB2OwnerName.Db2TableName", though this is slower, and if you use it to access large DB2 tables it is unusable, because it fetches the entire DB2 table locally and then operates on it.

    You can even use it to delete, update or insert data. OpenQuery also can be used to insert or delete from DB2, as in "delete from OpenQuery(db2linkedservername, 'select * from db2tablename')"....

    Search SQL server Books Online for "sp_addlinkedserver" and "openquery" for more details.
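
    A minimal sketch of the linked-server route described above. The server name, address, port, catalog, collection and login are all placeholders, and the provider-string keywords may differ between HIS versions, so check the provider documentation before relying on them:

    ```sql
    -- Register the DB2 host as a linked server via the HIS OLE DB provider
    EXEC sp_addlinkedserver
        @server = 'AS400DB2',            -- local name for the linked server
        @srvproduct = 'DB2OLEDB',
        @provider = 'DB2OLEDB',
        @catalog = 'MYLOC',              -- DB2 location name (syslocation)
        @provstr = 'Network Transport Library=TCPIP;Network Address=10.1.1.10;Network Port=446;Package Collection=MYPKG;Default Schema=MYOWNER'

    -- Map a SQL login to the DB2 credentials
    EXEC sp_addlinkedsrvlogin
        @rmtsrvname = 'AS400DB2',
        @useself = 'false',
        @rmtuser = 'dtsuser',
        @rmtpassword = 'secret'

    -- Pass-through query: the SELECT runs on DB2, only results come back
    SELECT * FROM OPENQUERY(AS400DB2, 'SELECT COL1, COL2 FROM MYOWNER.MYTABLE')
    ```

    The OPENQUERY form is usually the one to prefer for large tables, since the filtering happens on the host rather than after a full local fetch.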

  • For me, it's very simple. I install IBM's Client Access on the SQL Servers. This installs ODBC drivers.

    You then need to open the Operations Navigator to create a new connection to the 400. All you need to do is supply the AS400's computer name or IP address.

    Then, setup a system DSN for the 400's ODBC connection.

    In your DTS package when you create a new connection, scroll to the top of the list of connections and you will see the driver for the 400. You can use the "Client Access ODBC Driver (32-bit)" or "Other (ODBC Data Source)" drivers.

    Other things I do:

    Create a non-human login account to connect SQL to the 400.

    Ensure the account has read-only permissions on the source tables.

    Set the password to never expire for this account.

    Also, it's very important that you install the latest service pack for Client Access!
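
    Once the system DSN is in place, you can sanity-check it from Query Analyzer before wiring it into DTS. A quick sketch, assuming a DSN named AS400 and a hypothetical library and table:

    ```sql
    -- Ad hoc query through the Client Access ODBC DSN via the MSDASQL provider
    SELECT *
    FROM OPENROWSET('MSDASQL',
                    'DSN=AS400;UID=dtsuser;PWD=secret',
                    'SELECT * FROM MYLIB.MYTABLE')
    ```

    If this returns rows, the driver, DSN and account are all working, and any remaining problems are in the package itself.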

  • Here's what I use....


    CREATE PROCEDURE dw_Check_Source_Files
        @Result varchar(250) OUTPUT
    AS
    SET NOCOUNT ON
    SET @Result = ''

    DECLARE @SName varchar(50), @SPath varchar(250), @Critical bit
    DECLARE @FileExists int

    DECLARE SourceFile_cur CURSOR FOR
        SELECT SourceName, SourcePath, IsCritical FROM dw_SourceFiles

    OPEN SourceFile_cur
    FETCH NEXT FROM SourceFile_cur INTO @SName, @SPath, @Critical

    WHILE @@FETCH_STATUS = 0
    BEGIN
        EXEC master..xp_fileexist @SPath, @FileExists OUTPUT

        -- If a critical file doesn't exist, record it and abort
        -- (the caller turns a non-'OK' result into a DivideByZero error)
        IF @FileExists = 0 AND @Critical = 1
        BEGIN
            SELECT @Result = @Result + @SName + ' does not exist...Abort' + char(13)
            CLOSE SourceFile_cur
            DEALLOCATE SourceFile_cur
            RETURN
        END

        IF OBJECT_ID('tempdb..#FileDetails') IS NOT NULL
            DROP TABLE #FileDetails

        CREATE TABLE #FileDetails
        (
            AlternateName varchar(100),
            FileSize decimal(9,0),
            CreateDate varchar(10),
            CreateTime varchar(10),
            LastWriteDate varchar(10),
            LastWriteTime varchar(10),
            LastAccessDate varchar(10),
            LastAccessTime varchar(10),
            Attributes varchar(10)
        )

        INSERT INTO #FileDetails
        EXEC master..xp_getfiledetails @SPath

        -- A zero-byte critical file is also grounds to abort
        IF (SELECT FileSize FROM #FileDetails) = 0 AND @Critical = 1
        BEGIN
            SELECT @Result = @Result + @SName + ' is empty...Abort' + char(13)
            CLOSE SourceFile_cur
            DEALLOCATE SourceFile_cur
            DROP TABLE #FileDetails
            RETURN
        END

        DROP TABLE #FileDetails
        FETCH NEXT FROM SourceFile_cur INTO @SName, @SPath, @Critical
    END

    CLOSE SourceFile_cur
    DEALLOCATE SourceFile_cur
    SET @Result = 'OK'

    Then I call this procedure in a T-SQL job step...


    declare @Ret varchar(250)

    exec dw_Check_Source_Files @Result = @Ret OUTPUT

    select @Ret

    IF @Ret <> 'OK'
        select 1/0  -- force a divide-by-zero error so the job step fails


    Not beautiful, but it does the trick.


  • ....continued....

    I use this to check if files exist and if they are zero bytes.

    If you need more advanced conditional logic, you should steer away from DTS and use the Job system in my opinion.

  • Excellent! Thank you, md40 and salim; you've got me pointed in the right direction.


    Signature is NULL

  • Calvin,

    Sorry for my many typos...

    Just to shed more light on the subject: for those (and we do it also) who need to move a lot of data from an OLTP system using a host DB2, whether for data warehousing or to create a near-online image of the host data for reporting and for reducing the overhead on the host, IBM has a product called DBPropagator. It installs on the host DB2 system and eavesdrops on the DB2 logs, fetching all inserts, deletes and updates into a special tablespace called the Staging Area, which is actually a set of DB2 tables.

    IBM has two other products [DataJoiner and DB2 Integrator] used in conjunction with DBPropagator. They run on a Windows/Linux/Unix server, read all data changes from the host staging area, and populate another database (SQL/Oracle/UDB and so on).

    You can use either of the two, but DB2 integrator is the better solution. Be ready to pay a lot though ... and look for expensive mainframe expertise ...

    We use the two products now, but finding experts in this area is a nightmare ...

  • Regarding where you said to check and "don't clear the related SQL tables":

    Another approach you might find useful is to load from the AS400 into a temp table, then do your validation. The reason this might be preferable to checking connectivity and size and such is that it gives you the opportunity to do data-related validation before clearing the SQL table; for example, if there's ever a chance you might somehow get only half the data you expect.

    We've also found numerous cases where it is faster to import to a temp table, then insert into permanent tables, e.g. to get them into a better order, to minimize the time of lock contention in the permanent table, etc.
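
    That staging pattern might look something like this; the table and column names are made up for illustration, and the gender cleanse echoes the earlier example in this thread:

    ```sql
    -- Land the AS400 extract in a staging table first
    TRUNCATE TABLE stg_Customers

    -- (a DTS transform task loads stg_Customers here)

    -- Only touch the permanent table once the load passes validation
    IF (SELECT COUNT(*) FROM stg_Customers) > 0
    BEGIN
        BEGIN TRANSACTION
        DELETE FROM dbo.Customers
        INSERT INTO dbo.Customers (CustomerID, Name, Gender)
        SELECT CustomerID, Name,
               CASE WHEN Gender IN ('M','F') THEN Gender ELSE 'U' END  -- cleanse
        FROM stg_Customers
        COMMIT TRANSACTION
    END
    ```

    Wrapping the delete and insert in one transaction keeps the window of lock contention on the permanent table as short as possible.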

  • Thanks Salim. Our needs are much simpler than what the DBPropagator seems to offer. Also, if I can't do this cheap I can't do it at all.

    Basically, transactional data is processed on the mainframe and a file (table?) is built/updated. The data then needs to be exported to SQL, which will then continue with the data processing.

    What I'm looking for is the cheapest and/or simplest way to do this.

    Signature is NULL
