JSON_VALUE and an array

  • Hi,

I have a problem. The JSON I have stored looks like this, for example:
    - [{"Code":"001", "Valeur":"AAA"}, {"Code":"002", "Valeur":"BBB"}, {"Code":"003", "Valeur":"CCC"}, {"Code":"004", "Valeur":"DDDD1"}]
    - Or [{"Code":"001", "Valeur":"AAA2"}, {"Code":"002", "Valeur":"BBB"}, {"Code":"004", "Valeur":"DDDD2"}]

I want to get, in two columns, the value of the property "Valeur" for "Code" = "001" and "Code" = "004".

How can I do this?

Initially I used:

    "SELECT JSON_VALUE(StructureData_RubriquesString,'$[0].Valeur') as Nom, JSON_VALUE(StructureData_RubriquesString,'$[3].Valeur') as DateNaissance From StructureData"

But sometimes "Code":"003" is missing, so $[3] doesn't return the correct value.

How can I specify, in SQL with "JSON_VALUE", that I want only the "Valeur" value for "Code" = "001" and "004", in two distinct columns?

Nom  | DateNaissance
AAA  | DDDD1
AAA2 | DDDD2

    Best regards


  • Hi,

Here is a possible, though not very elegant, way. Does this give you any hints?

    DECLARE @TestValue VARCHAR(MAX) = '[{"Code":"001", "Valeur":"AAA"}, {"Code":"002", "Valeur":"BBB"}, {"Code":"003", "Valeur":"CCC"}, {"Code":"004", "Valeur":"DDDD1"}]'

;WITH ValueCTE (Code, Valeur) AS
(
    SELECT [Code], [Valeur]
    FROM OPENJSON(@TestValue)
    WITH ([Code] VARCHAR(10) '$.Code', [Valeur] VARCHAR(10) '$.Valeur')
    WHERE [Code] IN ('001', '004')
)
SELECT
    (SELECT Valeur FROM ValueCTE WHERE Code = '001') AS Nom,
    (SELECT Valeur FROM ValueCTE WHERE Code = '004') AS DateNaissance;

    Please let me know.
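A possible variant of the same idea, applied to the table from the original post, is to shred the JSON once per row with OPENJSON and pivot the two codes with conditional aggregation instead of two scalar subqueries. This is only a sketch: `StructureData_PrimaryKey` is a hypothetical key column name, and the VARCHAR lengths are guesses; adjust both to the real schema.

```sql
-- Sketch: one OPENJSON pass per StructureData row, pivoting Code 001/004
-- into two columns. StructureData_PrimaryKey is a hypothetical key column.
SELECT
    MAX(CASE WHEN j.Code = '001' THEN j.Valeur END) AS Nom,
    MAX(CASE WHEN j.Code = '004' THEN j.Valeur END) AS DateNaissance
FROM StructureData AS s
CROSS APPLY OPENJSON(s.StructureData_RubriquesString)
    WITH ([Code] VARCHAR(10) '$.Code', [Valeur] VARCHAR(50) '$.Valeur') AS j
WHERE j.Code IN ('001', '004')
GROUP BY s.StructureData_PrimaryKey;
```

Because the WITH clause filters each array element to just the two properties of interest, a missing "Code":"003" element simply produces a NULL in the corresponding column rather than shifting positions the way $[3] does.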

Thanks for your solution. The developer had been using the SUBSTRING function in SQL to handle this case, but it's too slow, so we are testing the JSON functions instead. We thought we could optimize our SQL, but it seems that isn't possible with the structure of the data in our database.

  • mcosani - Tuesday, June 26, 2018 1:39 AM

Thanks for your solution. The developer had been using the SUBSTRING function in SQL to handle this case, but it's too slow, so we are testing the JSON functions instead. We thought we could optimize our SQL, but it seems that isn't possible with the structure of the data in our database.

I have to wonder about the utility of JSON here, given that storing it in a database kind of defeats the purpose of having an RDBMS in the first place. Anything that parses XML or JSON isn't going to be terribly fast, especially at any volume, so I continue to be amazed that folks keep trying to use it for anything other than very small-volume work. That may or may not be your situation, but it's important to remember the limitations of a given tool so that solutions designed around it can actually work well.

    Steve (aka sgmunson) 🙂 🙂 🙂
    Rent Servers for Income (picks and shovels strategy)

I think so too, but another company did the development; now we have to find a way to optimize this application.

  • mcosani - Tuesday, June 26, 2018 7:50 AM

I think so too, but another company did the development; now we have to find a way to optimize this application.

So is the database actually storing a large volume (number of rows) of JSON in a given column? If so, you might want to consider rewriting the JSON pieces to use a normalized data structure. Optimization opportunities may be extremely minimal without a normalized data structure in place. There's only so much you can do when you have to parse a column to get at the values.

    Steve (aka sgmunson) 🙂 🙂 🙂
    Rent Servers for Income (picks and shovels strategy)

  • Hi,

Thanks for your reply. I confirm that a large volume of JSON is stored in a single column in our database. We are already thinking of rewriting the JSON with a flatter structure like {"001":"AAA", "004":"DDDD1"}. It seems to be the only way to optimize in our case.

    Best regards
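For what it's worth, a quick sketch of why that keyed structure helps: JSON_VALUE can then address each code directly by property name, with no positional array access and no dependence on which codes happen to be present.

```sql
-- Sketch: with the proposed keyed structure, each code becomes a property
-- that JSON_VALUE can reach directly; a missing code just yields NULL.
DECLARE @j NVARCHAR(MAX) = N'{"001":"AAA", "004":"DDDD1"}';

SELECT JSON_VALUE(@j, '$."001"') AS Nom,
       JSON_VALUE(@j, '$."004"') AS DateNaissance;
-- Nom = AAA, DateNaissance = DDDD1
```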

  • mcosani - Thursday, June 28, 2018 8:26 AM

    Hi,

Thanks for your reply. I confirm that a large volume of JSON is stored in a single column in our database. We are already thinking of rewriting the JSON with a flatter structure like {"001":"AAA", "004":"DDDD1"}. It seems to be the only way to optimize in our case.

    Best regards

It will probably help, but maybe not by much... The overhead of the JSON functions could still be high... although, if you simplify the JSON structure, that may either make CHARINDEX useful, or even allow you to be effective with a string-splitting function like Jeff Moden's code, found at the end of the article here: http://www.sqlservercentral.com/articles/72993/
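To illustrate the CHARINDEX idea against the simplified structure (a sketch only; the key and quoting conventions are assumed to be exactly as in the proposed format):

```sql
-- Sketch: locate a value in the simplified {"001":"AAA", "004":"DDDD1"}
-- structure with plain string functions, no JSON parser involved.
DECLARE @j     VARCHAR(MAX) = '{"001":"AAA", "004":"DDDD1"}';
DECLARE @key   VARCHAR(10)  = '"001":"';
DECLARE @start INT          = CHARINDEX(@key, @j) + LEN(@key);

SELECT SUBSTRING(@j, @start, CHARINDEX('"', @j, @start) - @start) AS Valeur;
-- Valeur = AAA
```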

    Steve (aka sgmunson) 🙂 🙂 🙂
    Rent Servers for Income (picks and shovels strategy)
