Compare Artificial Intelligence Models Built with T-SQL



Problem

Describe and demonstrate the operation of a couple of AI (artificial intelligence) models for making the same kind of decision. Also, present a framework for contrasting the decisions from two different AI models.

Solution

Data science is a hot topic among computer professionals. Within data science, one of the highest value techniques is how to build an AI model to make a business decision. This tip builds on a prior one, Artificial Intelligence Programming with T-SQL for Time Series Data, that introduces a relatively simple approach to estimate when to buy and sell stocks for a gain. This tip extends the approach in the earlier tip by building two new AI models with the same goal. It also presents a framework for contrasting results from the two different AI models.

The requirement to contrast two different approaches to modeling the same kind of outcome is a typical one in data science projects. It is common for a project to begin with multiple candidate rules for making the same decision. Time-based decisions are common in many different contexts, such as when to reorder items to restore inventory levels or whether to forecast snowy roads for tomorrow's drive to work. The approaches illustrated in this tip can be adapted for any time-based decision requirement.

Time-based decisions from AI models require underlying time series data to assess a model's performance. An AI model can make time-based decisions, such as when to buy and sell stocks, based on the time series data. MSSQLTips.com offers several prior tips on how to collect time series data and organize them for use within SQL Server (here, here, and here). Building an AI model is a use case for time series data.

Predicting what's going to happen next based on what happened recently

Time series data can track a stream of values over a sequence of time periods, such as days during which a stock market is open, daily shifts during which production is tracked at a factory, or hours between weather station readings. The trend and momentum of time series values can be revealed from relationships among their exponential moving averages (emas). You can compute multiple emas for a single time series value by computing the emas over different period lengths, such as the most recent ten or thirty periods.

  • If the ten-period ema is greater than the thirty-period ema for a time series value, then the series has a rising trend on average through the current period. This is because the ten-period ema represents the average time series value for the last ten periods, but the thirty-period ema denotes the average of a longer interval with twenty additional less recent values than for the ten-period ema.
  • If the ten-period ema rises from below to above the thirty-period ema for two consecutive periods, then the underlying time series values exhibit upward momentum. That is, the most recent time series values are not just rising relative to the less recent values, but the most recent values exhibit increased growth relative to the less recent time series values.

Ema values depend on their underlying time series values. For example, if the most recently available time series value is greater than its ten-period ema, then the ten-period ema for the next period will rise relative to the ten-period ema for the current period. This is because each new time series value pulls the ema from the prior period toward itself.
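The recurrence behind these observations can be sketched in a few lines of Python. This is an illustrative implementation of the conventional ema formula with smoothing factor k = 2 / (period length + 1), seeded with the first value; the tip's emas are actually computed by T-SQL stored procedures, so treat the function below as a conceptual aid rather than the tip's code.

```python
def ema(values, period_length):
    """Exponential moving average with smoothing factor
    k = 2 / (period_length + 1), seeded with the first value."""
    k = 2.0 / (period_length + 1)
    result = [values[0]]
    for v in values[1:]:
        # Each new value pulls the running average toward itself by a fraction k.
        result.append(k * v + (1 - k) * result[-1])
    return result

closes = [10.0, 10.0, 10.0, 10.0, 10.0, 12.0]
ema_10 = ema(closes, 10)
# The final close (12.0) exceeds the prior ema, so the last ema rises
# relative to the next-to-last ema, exactly as described above.
```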

The two AI models programmed in this tip rely both on the relationships between ema values with different period lengths and on the impact of a time series value from a prior period on the ema value for the current period. The prior tip on AI modeling with time series values and emas demonstrated that a relationship among emas with four different period lengths can anticipate the open price of a stock in the next time period. The four emas had period lengths of ten, thirty, fifty, and two hundred. When the ema for each shorter period length was greater than the ema for its nearest longer period length, the emas qualified the time series value as uptrending. That is, the time series values on which the emas were computed were trending up.

In this tip, the AI models only compare emas with period lengths of ten versus thirty. At a top-line conceptual level, the two models function as follows. The code in the next section illustrates the precise process for deriving buy and sell dates as well as their associated prices.

  • In the first version of the AI model for this tip,
    • when a ten-period ema exceeds the thirty-period ema, the decision is made to buy a stock; if the stock is already bought, then no action is taken
    • the decision to sell a stock is made on the first date after a prior buy date in which the ten-period ema does not exceed the thirty-period ema; no decision is made to sell a stock if it is already sold
    • after a sell date, the search for an initial buy date begins again
  • In the second AI model for this tip,
    • an extra requirement is added to the first model regarding the value of the underlying time series value from the prior period relative to the ema value in the current period
    • a decision to buy a stock is made when the ten-period ema exceeds the thirty-period ema and the close price from the prior period exceeds the ten-period ema; again if a stock is already bought, then no action is taken for the stock
    • on the first date after a buy date when the ten-period ema does not exceed the thirty-period ema or the close price from the prior period does not exceed the ten-period ema for the current period, then a decision is made to sell a stock
    • after a sell date, the search for a buy date begins again
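The two decision rules above can be summarized in a short Python sketch. The function name and the plain-list inputs are illustrative; the sketch also simplifies by treating the crossover condition itself as the buy/sell trigger, whereas the T-SQL in the next section confirms trend changes over consecutive trading dates and trades at the open price.

```python
def trades(dates, ema10, ema30, lag1_close=None):
    """Derive (buy_date, sell_date) pairs.

    First model: buy when ema10 > ema30; sell on the first later date
    where that no longer holds.  Second model (pass lag1_close): also
    require the prior period's close to exceed the current ema10.
    """
    result, buy = [], None
    for i, d in enumerate(dates):
        signal = ema10[i] > ema30[i]
        if lag1_close is not None:
            signal = signal and lag1_close[i] is not None and lag1_close[i] > ema10[i]
        if buy is None and signal:
            buy = d                      # enter on the first qualifying date
        elif buy is not None and not signal:
            result.append((buy, d))      # exit, then search for a new entry
            buy = None
    return result
```

Note how the second model's extra condition can only shorten or split the holding periods produced by the first model, never lengthen them.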

Programming what's going to happen next based on what happened recently

This section of the tip drills down on how to program both AI models with T-SQL. You will see in this section an adaptation of the code from the prior tip on which this tip builds. The goal of the code is to save in a SQL Server table named stored_trades the buy and sell decisions made by each model. You can think of the contents of the stored_trades table as a log of the decisions made by each model.

The code for implementing both models and saving the model decisions resides in a single script. The first portion of the script appears next.

  • This portion of the script starts by declaring for_csv_from_python as the default database. This database stores the underlying time series values used by this tip in a SQL Server table named yahoo_prices_valid_vols_only within the database's dbo schema. In fact, all the permanent objects used by the AI models reference the database's dbo schema. See this prior tip for a discussion of the source database and table.
  • The next segment of the introductory code for the models creates the date_symbol_close_period_length_ema table in the dbo schema. This table will subsequently have its five columns populated.
    • The symbol, date, and period length columns uniquely identify each row in the table.
    • The close column contains the closing price during a trading day for a symbol.
    • The last column contains ema values of closing prices for a combination of symbol, date, and period length values. Recall that there are just two period lengths for the data in this tip; one length is for the most recent ten periods, and the other length is for the most recent thirty periods.
  • The last segment of the script below creates a fresh version of the stored_trades table.
    • This table has at least two rows for each trade.
      • One row for the buy date of a trade.
      • A second row for the sell date of a trade.
    • Subsequent code illustrates how each of the two AI models populate this table.
-- specify the default database reference
use for_csv_from_python
go
 
-- Create two regular tables for the AI models
 
-- create a fresh copy of the date_symbol_close_period_length_ema
-- table to store one set of underlying time series 
-- values (close prices) and their exponential moving averages 
-- with different period lengths
begin try
   drop table [dbo].[date_symbol_close_period_length_ema]
end try
begin catch
   print '[dbo].[date_symbol_close_period_length_ema] is not available to drop'
end catch
 
create table [dbo].[date_symbol_close_period_length_ema](
   [date] [date] NOT NULL,
   [symbol] [nvarchar](10) NOT NULL,
   [close] [money] NULL,
   [period length] [float] NOT NULL,
   [ema] [money] NULL
)
 
-- add primary key constraint named symbol_date_period_length
-- to the date_symbol_close_period_length_ema table
begin try
alter table [dbo].[date_symbol_close_period_length_ema]
  drop constraint [symbol_date_period_length]
end try
begin catch
  print 'primary key not available to drop'
end catch
 
-- add a constraint to facilitate retrieving saved emas
alter table [dbo].[date_symbol_close_period_length_ema]
  add constraint symbol_date_period_length
    primary key (symbol, date, [period length]);
 
-- create a fresh copy of a table to store
-- trades for a symbol with buy and sell
-- prices and dates
begin try
   drop table [dbo].stored_trades
end try
begin catch
   print '[dbo].stored_trades is not available to drop'
end catch
 
create table [dbo].stored_trades(
   [symbol] [nvarchar](10) NOT NULL,
   [date] [date] NOT NULL,
   [open] [money] NULL,
   [high] [money] NULL,
   [low] [money] NULL,
   [close] [money] NULL,
   [volume] [bigint] NULL,
   [ema_10] [money] NULL,
   [ema_30] [money] NULL,
   [all_uptrending_emas] [int] NULL,
   [all_uptrending_emas_w_lag1_close] [int] NULL,
   [trending_status] [varchar](18) NULL,
   [trending_status_w_lag1_close] [varchar](18) NULL,
   [buy_price] [money] NULL,
   [sell_price] [money] NULL,
   [buy_price_w_lag1_close] [money] NULL,
   [sell_price_w_lag1_close] [money] NULL
   )

After the first portion of the script, five additional steps are performed to implement the two AI models and save the buy and sell decisions from each model. The script segment below displays code for the first two steps along with four declare statements. Each buy or sell decision is associated with a stock symbol and a date for when to buy or to sell the stock.

  • Before the code for step 1, there are four declare statements that can assign symbol values to the @symbol local variable.
    • The code for steps 1 through 5 is meant to be executed four times – once for each distinct symbol.
    • In the script below, the first three declare statements are commented out, but the last declare statement for @symbol is uncommented. Therefore, when running the script with this declare statement, steps 1 through 5 will be for the stock whose symbol is NFLX.
  • The code for step 1 consists of a single line of code that invokes the insert_computed_emas stored procedure with a parameter value set to the value of the @symbol local variable. This stored procedure populates the date_symbol_close_period_length_ema table for the input parameter value in @symbol.
  • A preceding tip, Exponential Moving Average Calculation in SQL Server, drills down on how to compute exponential moving averages with different period lengths. Especially examine the tip for a pair of stored procedures that computes exponential moving averages; the stored procedure names are usp_ema_computer and insert_computed_emas.
  • Step 2 adds two new columns to the original set from the date_symbol_close_period_length_ema table.
    • One column (all_uptrending_emas) is for the first AI model.
      • The column value assignment is 1 when ema_10 is greater than ema_30.
      • Otherwise, the column value assignment is 0.
    • The second column (all_uptrending_emas_w_lag1_close) is for the second AI model.
      • The column value assignment is 1 when ema_10 is greater than ema_30 and the close value for the prior trading day is greater than ema_10 for the current trading day.
      • Otherwise, the column value assignment is 0.
    • The new columns along with the source data columns from the date_symbol_close_period_length_ema table are saved in the #for_@symbol_tests table.
--declare @symbol nvarchar(10) = 'NBIX'
--declare @symbol nvarchar(10) = 'ULTA'
--declare @symbol nvarchar(10) = 'BKNG'
declare @symbol nvarchar(10) = 'NFLX'

-- step 1
-- run stored procs to save computed emas for a symbol
-- this step adds rows to the [dbo].[date_symbol_close_period_length_ema]
-- table with emas for the time series values with a symbol value of @symbol
exec dbo.insert_computed_emas @symbol
-- step 2
-- extract the ema columns (ema_10 and ema_30) one column at a time
-- and join them to each other for the symbol in the @symbol local variable

-- also add all_uptrending_emas and all_uptrending_emas_w_lag1_close
-- computed fields to the #for_@symbol_tests temp table
-- assign a value of 1 to all_uptrending_emas when the emas are uptrending
-- or a value of 0 otherwise
-- assign a value of 1 to all_uptrending_emas_w_lag1_close when the
-- emas are uptrending and the close value from the preceding row
-- is greater than ema_10, or a value of 0 otherwise

begin try
  drop table #for_@symbol_tests
end try
begin catch
  print '#for_@symbol_tests not available'
end catch

select 
symbol_10.*
,symbol_30.ema_30
,lag(symbol_10.[close],1) over (order by symbol_10.date) lag1_close
,
case
when
   symbol_10.ema_10 > symbol_30.ema_30 then 1
  else 0
end all_uptrending_emas
,
case
when 
  lag(symbol_10.[close],1) over (order by symbol_10.date) > symbol_10.ema_10 and
   symbol_10.ema_10 > symbol_30.ema_30 then 1
  else 0
end all_uptrending_emas_w_lag1_close

into #for_@symbol_tests
from
(
-- extract emas with 10 and 30 period lengths
select date, symbol, [close], ema ema_10
from [dbo].[date_symbol_close_period_length_ema] 
where symbol = @symbol and [period length] = 10) symbol_10

inner join

(select date, symbol, [close], ema ema_30
from [dbo].[date_symbol_close_period_length_ema] 
where symbol = @symbol and [period length] = 30) symbol_30

on symbol_10.DATE = symbol_30.DATE

The script segment below displays the code for steps 3 and 4, two intermediate code blocks on the way to populating the stored_trades table, which is completed in step 5. The main objective of steps 3 and 4 is to populate the trending_status and trending_status_w_lag1_close fields in the results set from step 4. Step 5 uses these field values over a three-day interval to compute buy and sell decisions for each of the two AI models.

  • As you can see from the code for step 3, the trending_status and trending_status_w_lag1_close fields are created in case statements.
    • The case statement for the trending_status field value is for the first AI model that bases buy and sell decisions exclusively on emas.
    • The case statement for the trending_status_w_lag1_close field is for the second AI model. This model expands the basis for buy and sell decisions from just emas to emas plus the close price field from the prior trading day.
    • The results set from step 3 is saved in the #close_emas_trending_status_by_date_symbol table for use in step 4.
  • Step 4 joins the results set from step 3 with the contents of the yahoo_prices_valid_vols_only table, which you may recall stores the underlying time series data for this tip. The join is based on a match of the symbol and date values from each of the data sources contributing to step 4's results set. The output from step 4 is saved in the #yahoo_prices_valid_vols_only_joined_with_emas_and_trending_status table.
-- step 3
-- add computed trending_status fields to 
-- #close_emas_trending_status_by_date_symbol temp table
-- which is based on #for_@symbol_tests temp table

begin try
  drop table #close_emas_trending_status_by_date_symbol
end try
begin catch
  print '#close_emas_trending_status_by_date_symbol not available to drop'
end catch

declare @start_date date = '2009-01-02', @second_date date = '2009-01-05'

-- mark and save close prices and emas with ema trending_status
select 
symbol
,date
,[close]
,ema_10
,ema_30
,all_uptrending_emas
,all_uptrending_emas_w_lag1_close
,
case
 when date in (@start_date, @second_date) then NULL

 when all_uptrending_emas = 1 and 
        lag(all_uptrending_emas,1) over (order by date) = 0
        then 'start uptrend'
 when all_uptrending_emas = 1 and 
        lead(all_uptrending_emas,1) over (order by date) = 1
        then 'in uptrend'
 when all_uptrending_emas = 1 and 
        lead(all_uptrending_emas,1) over (order by date) = 0
        then 'end uptrend'
 when all_uptrending_emas = 0 and 
        lag(all_uptrending_emas,1) over (order by date) = 1
        then 'start not uptrend'
 when all_uptrending_emas = 0 and 
        lead(all_uptrending_emas,1) over (order by date) = 0
        then 'in not uptrend'
 when all_uptrending_emas = 0 and 
        lead(all_uptrending_emas,1) over (order by date) = 1
        then 'end not uptrend'
 when isnull(lead(all_uptrending_emas,1) over (order by date),1) = 1
        then 'end of time series'
end trending_status
,
case
 when date in (@start_date, @second_date) then NULL
 when all_uptrending_emas_w_lag1_close = 1 and 
        lag(all_uptrending_emas_w_lag1_close,1) over (order by date) = 0
        then 'start uptrend'
 when all_uptrending_emas_w_lag1_close = 1 and 
        lead(all_uptrending_emas_w_lag1_close,1) over (order by date) = 1
        then 'in uptrend'
 when all_uptrending_emas_w_lag1_close = 1 and 
        lead(all_uptrending_emas_w_lag1_close,1) over (order by date) = 0
        then 'end uptrend'
 when all_uptrending_emas_w_lag1_close = 0 and 
        lag(all_uptrending_emas_w_lag1_close,1) over (order by date) = 1
        then 'start not uptrend'
 when all_uptrending_emas_w_lag1_close = 0 and 
        lead(all_uptrending_emas_w_lag1_close,1) over (order by date) = 0
        then 'in not uptrend'
 when all_uptrending_emas_w_lag1_close = 0 and 
        lead(all_uptrending_emas_w_lag1_close,1) over (order by date) = 1
        then 'end not uptrend'
 when isnull(lead(all_uptrending_emas_w_lag1_close,1) over (order by date),1) = 1
        then 'end of time series'
end trending_status_w_lag1_close

into #close_emas_trending_status_by_date_symbol
from #for_@symbol_tests
order by date
-- step 4
-- populate #yahoo_prices_valid_vols_only_joined_with_emas_and_trending_status with
-- the basic time series data

-- with each symbol having a succession of trading date rows
-- with open, high, low, close price values and 
-- shares exchanged (volume) for each trading date

-- with left join to #close_emas_trending_status_by_date_symbol for emas
-- and all_uptrending_emas plus trending status values

begin try
  drop table 
  #yahoo_prices_valid_vols_only_joined_with_emas_and_trending_status
end try
begin catch
  print '#yahoo_prices_valid_vols_only_joined_with_emas_and_trending_status
        not available to drop'
end catch

select 
[dbo].yahoo_prices_valid_vols_only.symbol
,[dbo].yahoo_prices_valid_vols_only.date
,[dbo].yahoo_prices_valid_vols_only.[open]
,[dbo].yahoo_prices_valid_vols_only.high
,[dbo].yahoo_prices_valid_vols_only.low
,[dbo].yahoo_prices_valid_vols_only.[close]
,[dbo].yahoo_prices_valid_vols_only.volume
,#close_emas_trending_status_by_date_symbol.ema_10
,#close_emas_trending_status_by_date_symbol.ema_30
,#close_emas_trending_status_by_date_symbol.all_uptrending_emas
,#close_emas_trending_status_by_date_symbol.all_uptrending_emas_w_lag1_close
,#close_emas_trending_status_by_date_symbol.trending_status
,#close_emas_trending_status_by_date_symbol.trending_status_w_lag1_close

into #yahoo_prices_valid_vols_only_joined_with_emas_and_trending_status
from   [dbo].[yahoo_prices_valid_vols_only]
left join #close_emas_trending_status_by_date_symbol
on yahoo_prices_valid_vols_only.symbol = 
#close_emas_trending_status_by_date_symbol.symbol
and yahoo_prices_valid_vols_only.date = 
#close_emas_trending_status_by_date_symbol.date

where yahoo_prices_valid_vols_only.symbol = @symbol
order by yahoo_prices_valid_vols_only.date

The final operations for populating the stored_trades table are performed in step 5, whose code appears below.

  • This step accepts its input from the #yahoo_prices_valid_vols_only_joined_with_emas_and_trending_status table.
  • In addition, four case statements create and populate the buy and sell prices for each model.
    • The buy and sell prices from the first AI model are named buy_price and sell_price.
    • The buy and sell prices from the second AI model are named buy_price_w_lag1_close and sell_price_w_lag1_close.
    • The buy and sell dates for each model correspond to the date column values for rows with non-null values for buy_price, sell_price, buy_price_w_lag1_close, and sell_price_w_lag1_close.
  • The results set from step 5 is added to the rows in stored_trades table via an Insert into…Select statement.
  • By successively running the code for steps 1 through 5 for each of the four declare statements at the top of the script for step 1, you can add rows for each symbol with buy and sell decisions for both of the models to the stored_trades table.
  • Alternatively, you can place the code for steps 1 through 5 in a while loop that populates the @symbol local variable with as many symbols as your application requires.
-- step 5
-- insert into [dbo].[stored_trades] table
-- buy_price and sell_price are based exclusively on 
-- trending status for the first AI model
-- buy_price_w_lag1_close and sell_price_w_lag1_close are based exclusively on 
-- trending_status_w_lag1_close for the second AI model

insert into [dbo].[stored_trades]
(
[symbol]
,[date]
,[open] 
,[high]
,[low]
,[close]
,[volume]
,[ema_10]
,[ema_30]
,[all_uptrending_emas]
,[all_uptrending_emas_w_lag1_close] 
,[trending_status]
,[trending_status_w_lag1_close]
,buy_price
,sell_price
,[buy_price_w_lag1_close]
,[sell_price_w_lag1_close]
)

select
*, 
case
  when trending_status = 'in uptrend' and
     lag(trending_status, 1) over (order by date) = 'start uptrend' and
     (lag(trending_status, 2) over (order by date) = 'end not uptrend' 
     or lag(trending_status, 2) over (order by date) is null)
  then [open]
end buy_price
,
case
  when trending_status = 'in not uptrend' and
     lag(trending_status, 1) over (order by date) = 'start not uptrend' and
     lag(trending_status, 2) over (order by date) = 'end uptrend' 
  then [open]
end sell_price
,
case
  when trending_status_w_lag1_close = 'in uptrend' and
     lag(trending_status_w_lag1_close, 1) over (order by date) = 'start uptrend' and
     (lag(trending_status_w_lag1_close, 2) over (order by date) = 'end not uptrend' 
     or lag(trending_status_w_lag1_close, 2) over (order by date) is null)
  then [open]
end buy_price_w_lag1_close

,
case
  when trending_status_w_lag1_close = 'in not uptrend' and
     lag(trending_status_w_lag1_close, 1) over (order by date) = 'start not uptrend' and
     lag(trending_status_w_lag1_close, 2) over (order by date) = 'end uptrend' 
  then [open]
end sell_price_w_lag1_close

from #yahoo_prices_valid_vols_only_joined_with_emas_and_trending_status
order by date

Summary results sets from the stored_trades table

The predecessor tip to this one described and programmed a single AI model. The analysis of the decisions made by the single model confirmed that it is possible to devise a model that projects when to buy and sell stocks so that the average sell price is greater than the buy price.

However, there are usually multiple ways to specify a model for making decisions. Another worthy question to ask and answer is which of two different AI models yields better decisions. This section of the tip analyzes the data in the stored_trades table to show how to assess, for a sample of buy and sell decisions, which AI model was more successful at finding sell prices that exceeded buy prices.

You can display a select set of column values for all the rows in the stored_trades table with the following simple script. These columns bear directly or indirectly on the buy-sell decision rules for the first and second AI models.

select
   symbol
  ,date
  ,[open]
  ,[close]
  ,ema_10
  ,ema_30
  ,trending_status
  ,trending_status_w_lag1_close
  ,buy_price
  ,sell_price
  ,buy_price_w_lag1_close
  ,sell_price_w_lag1_close
from [for_csv_from_python].[dbo].[stored_trades]

The following screen shot shows an excerpt from the results set from the preceding script with some markup to highlight specific features of the log in the stored_trades table.

  • The excerpt is for the first thirty-eight rows from the table. It depicts data for dates from 2009-01-02 through 2009-02-26.
  • There are two pairs of rows highlighted in yellow in the excerpt.
    • The first highlighted row is for 2009-01-07. This row is for the buy date from the second AI model. The model uses the open price on the buy date as its buy price, which is 4.7157 in this case. The buy price for the second AI model appears in the buy_price_w_lag1_close column.
    • The second highlighted row is for 2009-01-14. This is the first sell date for the second AI model. The model uses the open price on the sell date as its sell price, which is 4.3629. The sell price for the second AI model appears in the sell_price_w_lag1_close column.
    • These buy and sell decisions are the output of the second AI model. For this initial pair of decisions, the second model did not find a sell price greater than the buy price.
    • The second pair of highlighted rows is also for the second AI model.
      • The third highlighted row is the buy date (2009-01-29) for the second buy-sell decision. The price on the buy date is 5.0571.
      • The fourth highlighted row is for the sell date and price of the second decision. In this case, the sell date is 2009-02-25, and the sell price is 5.2271.
      • The sell price is greater than the buy price. In other words, the model found a pair of buy and sell dates for which the stock transaction generated a profit.
  • The first model shows a buy price, but there is no sell price in the screen shot below. This is because the sell date falls after the last date in the screen shot.
    • The buy date for the first AI model is the same as the buy date for the second AI model. The initial buy dates match because the first model's buy rule is a subset of the second model's buy rule. The first model picks a buy date based on the first period, counting from the start of the time series data or from a prior sell date, in which the ten-period exponential moving average (ema_10) is greater than the thirty-period exponential moving average (ema_30). Recall that the second AI model adds a requirement for a buy date: the preceding close price must also be greater than ema_10 in the current row. The first AI model does not impose this additional requirement.
    • The first buy date and buy price from the first AI model are, respectively, 2009-01-07 and 4.7157. The buy price for the first AI model appears in the buy_price column.
Compare_AI_Models_fig1

The next screen shot shows another excerpt from the results set for the preceding script with the first sell date and price for the first decision from the first AI model.

  • The sell price appears in the sell_price column.
  • The sell date and price are, respectively, 2009-05-08 and 5.7171. The buy price for the first buy-sell decision from the first AI model appears in row 4 of the preceding screen shot.
  • Because the sell price of 5.7171 is greater than the buy price of 4.7157, the first AI model returned a profitable trade in its first transaction.
Compare_AI_Models_fig2

You can collect buy and sell prices for any collection of trades for each symbol being tracked. The analyses examine results for four symbols: BKNG, NBIX, NFLX, and ULTA. You can also choose any subset or all the trades for any symbol with each model separately. An Excel workbook was populated with excerpted values for some early trades from the preceding results set. In addition, analyses were performed on the values copied to individual workbook tabs for each of the four symbols tracked in this tip.

The following screen shot shows the workbook tab for the NFLX symbol.

  • The values in rows 1 through 22 are for the first AI model.
  • Each tab presents results for five trades per model for its symbol.
    • Rows 1 and 24 are the header rows for the sets of values from, respectively, the first and second AI models.
    • The first row for each trade from a model, such as row 3, is for the buy decision; row 3 holds the buy decision in the first trade from the first AI model.
    • The second row for each trade, such as row 4, is for the sell decision in the first trade.
    • The first five Column O (change per share) values show the change between buy and sell prices for a single share in each of the five trades for a symbol from a model.
    • The first five Column Q (change per $1000 of shares) values show the change in value for as many shares (including fractional shares) as can be purchased with $1000 for each trade.
    • The value in cell Q22 shows the change in value for $1000 worth of shares across all five trades for the first AI model.
    • The values in rows 26 through 45 show corresponding results for the second AI model.
    • The total change across all five trades of $1000 worth of shares was $1,267.95 for the first AI model. In contrast, the total change across all five trades of $1000 worth of shares with the second AI model was -$189.50. As you can see, the first AI model returns more favorable trade outcomes than the second AI model for this set of trades.
    • The tabs labelled BKNG, NBIX, and ULTA present comparable copied and computed values for the remaining three symbols. The workbook file with these and other results is available in the download for this tip.
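The change per share (Column O) and change per $1000 of shares (Column Q) computations reduce to a couple of lines of arithmetic, sketched here in Python. The function name trade_changes is illustrative rather than part of the tip's workbook; the sample prices are trades quoted earlier in this section.

```python
def trade_changes(trades, stake=1000.0):
    """For each (buy_price, sell_price) pair, return the change per share
    and the change on `stake` dollars' worth of (fractional) shares."""
    rows = []
    for buy_price, sell_price in trades:
        per_share = sell_price - buy_price
        shares = stake / buy_price            # fractional shares allowed
        rows.append((per_share, shares * per_share))
    return rows

# Buy and sell prices quoted earlier in this section: the first trade from
# the first AI model and a later trade from the second AI model.
rows = trade_changes([(4.7157, 5.7171), (5.0571, 5.2271)])
total = sum(change_per_stake for _, change_per_stake in rows)
```

Summing the second element across a model's trades for a symbol reproduces the kind of total shown in cell Q22 of the workbook tab.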
Compare_AI_Models_fig3

The Summary tab displays individual symbol and aggregate results across all four symbols for both AI models. The values from this tab appear below.

  • The results from the first AI model with a header of emas_10_30 show the summary results for $1000 worth of shares for each symbol separately as well as the sum across all four symbols.
  • The results from the second AI model with a header of emas_10_30_w_lag1_close display comparable results for each individual symbol as well as across all symbols.
  • Cell E7 shows that the total return from the first AI model was over 300 percent greater than for the second model.
Compare_AI_Models_fig4

This tip presents a small sample of the total results that could be provided. The framework demonstrated in this tip can be applied to all, or any subset of, trades from any set of symbols.

Next Steps

There are at least four next steps for this tip.

  1. Review the model descriptions and the code to make sure you understand and can apply the models.
  2. Run the models with the sample data illustrated in the tip to confirm you can duplicate the results.
  3. Try running the models with some additional symbols beyond those presented in the tip to confirm your understanding of the data management steps as well as the robustness of the model outcomes across a wider set of symbols than those reported on in the tip. Also, assess which model returns the most profitable trades for a fresh sample of symbols.
  4. Adapt the models' code to your organization's time series data and tweak the model design to achieve the goals important to your organization.

The download available with this tip includes four files to help you perform the next steps described above. Here's a brief description of each file in the download.

  • "comparing_programmed_AI_models.sql" contains all the T-SQL code presented in this tip.
  • "underlying_data_for_programming_artificial_intelligence.csv" contains underlying time series data for the four symbols used to demonstrate the application of the models.
  • "bonus_data_for_programming_artificial_intelligence.csv" contains time series data for four additional symbols besides those explicitly presented and discussed.
  • "programming_AI_models_with_SQL_10ema_vs_30ema_new.xlsx" is an Excel workbook file. You can use it as an example for laying out your own results based on the time series data for the four bonus symbols as well as your organization's data warehouses and/or operational data sources.


About the author
Rick Dobson is an author and an individual trader. He is also a SQL Server professional with decades of T-SQL experience that includes authoring books, running a national seminar practice, working for businesses on finance and healthcare development projects, and serving as a regular contributor to MSSQLTips.com. He has been growing his Python skills for more than half a decade, especially for data visualization and ETL tasks with JSON and CSV files. His most recent professional passions include financial time series data and analyses, AI models, and statistics. He believes the proper application of these skills can help traders and investors make more profitable decisions.

This author pledges the content of this article is based on professional experience and not AI generated.
