SQLServerCentral is supported by Red Gate Software Ltd.
Iterating through a Table's Columns
Posted Tuesday, November 27, 2012 12:03 PM


SSChampion


Group: General Forum Members
Last Login: Today @ 3:51 PM
Points: 13,126, Visits: 11,963
richard.noordam (11/27/2012)
First of all, I was NOT looking for a specific solution, so your comment and snarkiness are not appreciated, nor warranted.


If I came across as snarky, I apologize. I think perhaps you misunderstood my coughing around a cursor.


I was indeed looking for a pattern of behavior to apply to a problem I have; this problem is a very minor piece of the whole problem that I'm trying to solve.

I was already using a COALESCE statement with a SELECT to generate the column list, and the solution wasn't to put the entire set of code here and ask that it be torn apart. (FYI: another part is a crosstab using said column list, which works awesomely by the way, but that is outside of what I'm asking.) Mainly because the problem that I asked about here can be stated simply.

What is the best way to deal with a set of rows and determine/return the first occurrence of a non-zero column value in each row, using a set of columns that are determined at runtime?

There should be a 'best practice' for this, I would think; maybe not.
In any case, there should be a pattern of behavior in assessing and attempting to solve it that doesn't rely on specific data.


You are looking for best practices in dealing with data that is denormalized. I get that you have done this via a dynamic cross tab, but there isn't really a normal way of handling this. Your best chance is the COALESCE. An even better place in the process is to get your data the way you want it when you crosstab it initially.
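To illustrate the COALESCE idea, here is a minimal static sketch. The table and column names (dbo.my_table, F1 through F5) and the use of '0' as the zero marker are assumptions for illustration, not the OP's actual schema:

```sql
-- Sketch only: dbo.my_table, F1..F5, and '0' as the zero marker are assumed.
-- NULLIF(col, '0') turns the zero marker into NULL;
-- COALESCE then returns the leftmost non-NULL value in each row.
SELECT ID,
       COALESCE(NULLIF(F1, '0'),
                NULLIF(F2, '0'),
                NULLIF(F3, '0'),
                NULLIF(F4, '0'),
                NULLIF(F5, '0')) AS first_non_zero
FROM dbo.my_table;
```

When the column list is only known at runtime, the same expression can be assembled as a string from the catalog views and run via sp_executesql, as a later post in this thread demonstrates.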


So datasets were 'extra' and unnecessary, even though I provided one in the second post.


What you posted is not a dataset. Nobody can load it into a table without first writing all the CREATE TABLE and INSERT statements. Keep in mind that we are all volunteers around here. When you don't post things in a readily consumable format, the best anybody can do is take a shot in the dark.

Your post that contained your solution sounded very much like you wanted help with a better approach. You included selecting from a view (that we don't have), then updating a table (that we don't have). It seems that you want a magical solution to a problem without providing the details needed to solve it.




_______________________________________________________________

Need help? Help us help you.

Read the article at http://www.sqlservercentral.com/articles/Best+Practices/61537/ for best practices on asking questions.

Need to split a string? Try Jeff Moden's splitter.

Cross Tabs and Pivots, Part 1 – Converting Rows to Columns
Cross Tabs and Pivots, Part 2 - Dynamic Cross Tabs
Understanding and Using APPLY (Part 1)
Understanding and Using APPLY (Part 2)
Post #1389363
Posted Tuesday, November 27, 2012 12:05 PM
SSC-Addicted


Group: General Forum Members
Last Login: Wednesday, January 16, 2013 4:23 PM
Points: 415, Visits: 2,333
richard.noordam (11/27/2012)

What is the best way to deal with a set of rows and determine/return the first occurrence of a non-zero column value in each row, using a set of columns that are determined at runtime?

In my opinion, the best practice here would be to do the least clever option.

That said, doesn't it really depend on how responsive you need your system to be? I.e., if a new column appears, would you like your code to adapt immediately, or will hand-editing a routine be enough?

I have a routine that checks for column additions or changes for a few tables, but it doesn't do much more than alert me to the fact that I need to update a few routines. I could make it dynamic, of course, but I worry about leaving the company with too much "clever" programming. It might be a borderline concern.
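For what it's worth, the core of such a column-change check can be sketched as a catalog query; the table name below is hypothetical, and the alerting would come from diffing this output against a saved snapshot on a schedule:

```sql
-- Hedged sketch (dbo.my_table is an assumed name): read the current column
-- metadata so a scheduled job can compare it with a previously saved
-- snapshot and alert on any additions or changes.
SELECT c.column_id,
       c.name,
       st.name AS type_name,
       c.max_length
FROM sys.columns AS c
INNER JOIN sys.types AS st
    ON c.user_type_id = st.user_type_id
WHERE c.object_id = OBJECT_ID('dbo.my_table')
ORDER BY c.column_id;
```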

On the other hand, I have a system I threw together with m4, the macro processor: I supply a server name, database name, and network share, and it produces a hardwired set of backup routines. I actually did it that way because we like nice static backup T-SQL routines, but I got tired of doing the manual edits for each new database.


There should be a 'best practice' for this, I would think; maybe not.
In any case, there should be a pattern of behavior in assessing and attempting to solve it that doesn't rely on specific data.


I've never had much luck discussing interesting and more generalized topics here, and have for the most part just decided to stick to answering a few posts now and then, but maybe you'll fare better. Best of luck in any case!
Post #1389364
Posted Tuesday, November 27, 2012 12:18 PM
Right there with Babe


Group: General Forum Members
Last Login: Monday, April 14, 2014 3:54 PM
Points: 727, Visits: 425

You are looking for best practices in dealing with data that is denormalized. I get that you have done this via a dynamic cross tab, but there isn't really a normal way of handling this. Your best chance is the COALESCE. An even better place in the process is to get your data the way you want it when you crosstab it initially.


This is what I was looking for.


What you posted is not a dataset. Nobody can load it into a table without first writing all the CREATE TABLE and INSERT statements. Keep in mind that we are all volunteers around here. When you don't post things in a readily consumable format, the best anybody can do is take a shot in the dark.

Your post that contained your solution sounded very much like you wanted help with a better approach. You included selecting from a view (that we don't have), then updating a table (that we don't have). It seems that you want a magical solution to a problem without providing the details needed to solve it.


I didn't need nor want a specific solution to this 'data'; the 'data' was an 'example set', not a 'working set'. I was indeed looking for a more 'patterned' approach, more of a WAY to solve this type of problem rather than the supposed 'magical' fix.

Apparently, I was already doing it.

The iterative approach through the columns works, and does indeed allow me to generate the YRQStart (first-occurrence column) precisely because of that iteration... but dang it, it uses a cursor, and any time anyone uses a cursor, I hear "it should be a set-based approach".

I am unclear how I can clear that point up; would you like to expound on it?
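One cursor-free pattern for the "first non-zero column" question can be sketched like this; the table and column names (dbo.my_table, F1 through F5) and the '0' marker are assumptions for illustration:

```sql
-- Set-based sketch (assumed schema): CROSS APPLY (VALUES ...) unpivots each
-- row's columns into (position, value) pairs, and TOP (1) ... ORDER BY pos
-- picks the first non-zero value going left to right -- no cursor involved.
SELECT mt.ID, ca.first_non_zero
FROM dbo.my_table AS mt
CROSS APPLY (
    SELECT TOP (1) v.val AS first_non_zero
    FROM (VALUES (1, mt.F1), (2, mt.F2), (3, mt.F3),
                 (4, mt.F4), (5, mt.F5)) AS v(pos, val)
    WHERE v.val <> '0'
    ORDER BY v.pos
) AS ca;
```

Note that CROSS APPLY drops rows where every column is zero; OUTER APPLY would keep them with a NULL. For runtime-determined columns, the VALUES list itself would have to be built dynamically from the catalog views.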
Post #1389369
Posted Tuesday, November 27, 2012 12:55 PM


SSChampion


Group: General Forum Members
Last Login: Today @ 3:51 PM
Points: 13,126, Visits: 11,963
Due to the denormalized data and the very unique situation, it is impossible to give a generic "this is the set-based way to do this". If what you have is satisfactory to you, then I say go for it.

Post #1389386
Posted Wednesday, November 28, 2012 1:23 PM
Right there with Babe


Group: General Forum Members
Last Login: Monday, April 14, 2014 3:54 PM
Points: 727, Visits: 425
I do have to say, you should give yourself a bit more credit. As our private messages indicated, you did understand my problem, and did suggest a potential solution, which did indeed change the approach I was using for that section of the problem...

to a 75% decrease in run time as well.

And it's beginning to appease my need for a bit more elegant solution. ;)

THANKS!!!
Post #1390092
Posted Wednesday, November 28, 2012 1:35 PM


SSChampion


Group: General Forum Members
Last Login: Today @ 3:51 PM
Points: 13,126, Visits: 11,963
richard.noordam (11/28/2012)
I do have to say, you should give yourself a bit more credit. As our private messages indicated, you did understand my problem, and did suggest a potential solution, which did indeed change the approach I was using for that section of the problem...

to a 75% decrease in run time as well.

And it's beginning to appease my need for a bit more elegant solution. ;)

THANKS!!!


Thanks, Richard. It still uses a cursor, but it is better at least. As I said in our PM, the problem in helping come up with a generic solution is that the language required to explain it is SQL, and since we didn't have DDL to work with, I couldn't write that SQL. It would be really cool of you to post the solution I sent you (I didn't save a copy of it myself); that way, anybody who stumbles on this thread in the future will be able to see a way to solve it.


Post #1390101
Posted Thursday, December 6, 2012 5:37 AM


SSCommitted


Group: General Forum Members
Last Login: Today @ 5:21 AM
Points: 1,502, Visits: 2,671
Here is an example of a possible method you could use:
-- set up a sample table with data
IF OBJECT_ID('dbo.my_table') IS NOT NULL
    DROP TABLE dbo.my_table
GO

CREATE TABLE dbo.my_table
(
    ID int IDENTITY(1,1) NOT NULL,
    F1 varchar(4),
    F2 varchar(4),
    F3 varchar(4),
    F4 varchar(4),
    F5 varchar(4)
)

SET NOCOUNT ON

INSERT my_table (F1,F2,F3,F4,F5) SELECT '0','0','9673','0','9781'
INSERT my_table (F1,F2,F3,F4,F5) SELECT '0','9672','9673','0','0'
INSERT my_table (F1,F2,F3,F4,F5) SELECT '0','9672','9673','9674','0'
INSERT my_table (F1,F2,F3,F4,F5) SELECT '0','0','0','9674','9781'

SELECT * FROM my_table
-----------------------------
-- build up the SQL statement and execute it
DECLARE
    @my_columns nvarchar(MAX),
    @sql nvarchar(MAX)

-- wrap each qualifying column in NULLIF(col, '0'), in column order,
-- so that zeros become NULL
SELECT @my_columns =
    COALESCE(LTRIM(RTRIM(@my_columns)) + ',', '') + 'NULLIF([' + c.name + '],''0'')'
FROM
    sys.tables t
    INNER JOIN sys.columns c
        ON t.object_id = c.object_id
    INNER JOIN sys.types st
        ON c.user_type_id = st.user_type_id -- user_type_id avoids spurious matches from alias types
WHERE
    t.name = 'my_table'
    AND st.name = 'varchar'
    AND c.max_length = 4
    AND c.column_id >= 2 -- skip the ID column
ORDER BY
    c.column_id

-- COALESCE over the NULLIF list returns the first non-zero value in each row
SELECT @my_columns = 'COALESCE(' + @my_columns + ')'

SET @sql = 'SELECT ID, ' + @my_columns + ' AS my_value FROM my_table'

PRINT @sql

EXEC sp_executesql @sql


Post #1393460