hi, I hear more and more that we have too many connections to our servers. Many are SQL, many aren't.
So we are told to use APIs where possible.
Isn't there still a connection behind the scenes when an API is asked to return data from whatever the ERP's database is?
February 20, 2026 at 4:50 pm
Yes there is. But perhaps the desire is to remove direct database connections.
February 20, 2026 at 5:58 pm
thx Phil, are direct connections worse?
It depends.
For one, you could cause issues with API calls as well, but an API server can throttle calls if needed, even if the caller was irresponsible enough not to implement throttling themselves.
And then there is the question of what constitutes too many calls: 10k connections per minute, per hour, or per day?
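To make the throttling point concrete, here is a minimal token-bucket sketch in Python. The class and parameter names are made up for illustration; real API gateways implement this (and variants like sliding windows) for you.

```python
import time

class TokenBucket:
    """Toy rate limiter: allows `rate` calls/sec on average, bursts up to `capacity`."""
    def __init__(self, rate, capacity):
        self.rate = rate          # tokens added per second
        self.capacity = capacity  # maximum burst size
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self):
        now = time.monotonic()
        # refill tokens based on elapsed time, capped at capacity
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        # caller is over the limit; the API can return 429 instead of hitting the db
        return False

bucket = TokenBucket(rate=1, capacity=5)
# a burst of five calls succeeds; an immediate sixth call is throttled
results = [bucket.allow() for _ in range(6)]
```

The point is that this check runs on the API server regardless of whether the caller was polite, which is exactly the control a direct database connection doesn't give you.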
February 20, 2026 at 7:10 pm
APIs usually pool their connections; I think this happens automagically. So it's not one connection per user, but a smaller set of connections shared by all users. Like most things that happen automagically, sometimes they don't do what you expect and you have to do some tuning. So if the app isn't pooling connections, or is pooling them inefficiently, you can still end up with "too many connections".
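For anyone who hasn't seen a pool up close, here's a toy sketch of the idea in Python, using sqlite3 so it runs standalone. Real drivers and ORMs (ODBC, JDBC, SQLAlchemy, ADO.NET) do this automagically, typically with tunable min/max sizes and idle timeouts; the class below is purely illustrative.

```python
import queue
import sqlite3

class ConnectionPool:
    """Toy pool: N connections shared by all callers instead of one per user."""
    def __init__(self, size, dsn=":memory:"):
        self._pool = queue.Queue(maxsize=size)
        for _ in range(size):
            self._pool.put(sqlite3.connect(dsn, check_same_thread=False))

    def acquire(self, timeout=5):
        # blocks until a connection is free; raises queue.Empty on timeout
        return self._pool.get(timeout=timeout)

    def release(self, conn):
        # return the connection for the next caller instead of closing it
        self._pool.put(conn)

pool = ConnectionPool(size=2)   # 2 connections serve any number of callers
c1 = pool.acquire()
c2 = pool.acquire()
pool.release(c1)                # returning c1 lets the next caller proceed
c3 = pool.acquire()             # reuses c1 rather than opening a new connection
```

This is also where the tuning mentioned above happens: a pool sized too small makes callers queue, one sized too large recreates the original "too many connections" problem.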
February 20, 2026 at 8:24 pm
Another question for you: what problem are you trying to solve, and how will you know it is solved? "Too many connections" is arbitrary and not helpful, BUT "too many connections causing measurable performance issues" is helpful. Get the measurements, build a test environment, and do some testing to see if the API helps OR if those performance issues were caused by something else. MAYBE it makes sense to have a dedicated server for the SQL instance, since you say many of the connections aren't SQL related.
Fixing something because someone once read a blog saying direct connections to SQL are bad and you should use APIs instead MAY not be a good fit for your environment and/or use case. We use direct database connections for most in-house apps and things work fine; the only time I notice issues is with a 3rd-party connector (i.e., an API) being used to allow cloud systems to talk to the database. So in my specific scenario the API is causing more problems, but thankfully the software using the API seems to handle things well and the rollbacks are not a big deal - they are simple SELECT queries that get deadlocked and rolled back.
Before making drastic changes to your environment (implementing API server(s) to handle connections to SQL), I'd first make sure there is an actual bottleneck that the APIs will help solve.
The above is all just my opinion on what you should do.
As with all advice you find on a random internet forum, you shouldn't blindly follow it. Always test on a test server to see if there are negative side effects before making changes to live!
I recommend you NEVER run "random code" you found online on any system you care about UNLESS you understand and can verify the code OR you don't care if the code trashes your system.
February 20, 2026 at 9:40 pm
Exactly! Like my dad always said, "If it ain't broke, don't fix it." In SQL Server, the corollary is "if you don't know your change is going to make it better, don't deploy it to production."
February 20, 2026 at 9:45 pm
I've had a few customers do this and they just move the load. Unless the API server caches results for its connections, you still have a lot of queries to the db.
In fact, sometimes you get more because people make more API calls thinking they are free.
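To make the "unless the API server caches results" trade-off concrete, here's a minimal TTL-cache sketch in Python (all names are illustrative, not any particular product's API). Identical requests within the TTL never touch the database, which is exactly what keeps the load from simply moving.

```python
import time

class TtlCache:
    """Cache query results for `ttl` seconds: fewer db hits, but results can be stale."""
    def __init__(self, ttl):
        self.ttl = ttl
        self._store = {}   # key -> (expires_at, value)

    def get_or_fetch(self, key, fetch):
        now = time.monotonic()
        hit = self._store.get(key)
        if hit and hit[0] > now:
            return hit[1]                # served from cache; the db never sees this call
        value = fetch()                  # cache miss: the query still lands on the db
        self._store[key] = (now + self.ttl, value)
        return value

db_hits = 0
def query_db():
    """Stand-in for a real database query; counts how often the db is actually hit."""
    global db_hits
    db_hits += 1
    return [("widget", 42)]

cache = TtlCache(ttl=60)
cache.get_or_fetch("top_widgets", query_db)   # first call hits the db
cache.get_or_fetch("top_widgets", query_db)   # repeat call is served from cache
```

The flip side, as noted later in the thread, is that anything served from `_store` may be up to `ttl` seconds stale.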
February 23, 2026 at 7:42 pm
Another thing, like Steve Jones said: having the API server just moves the load. And if the API server caches things, you run the risk of it sending back stale data when requested. That may or may not be acceptable.
Depending on the workload, there are multiple solutions.
For example, if you have a read-heavy workload, multiple SQL instances set up with replication may be a better solution, or a reporting solution that caches the data on a schedule.
If it is a write-heavy workload AND you need eventual (not immediate) consistency in the database, an API server that can cache the writes to the database may be beneficial.
LOTS of solutions, just need to know what problem you are trying to fix.
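The write-heavy, eventual-consistency case above is essentially write-behind caching. Here is a minimal sketch, using sqlite3 so it runs standalone; table and column names are made up for illustration, and a real implementation would also flush on a timer and handle failures.

```python
import sqlite3

class WriteBuffer:
    """Write-behind sketch: accept writes immediately, flush to the db in batches.
    The database is eventually consistent; a crash before flush() loses buffered rows."""
    def __init__(self, conn, batch_size=100):
        self.conn = conn
        self.batch_size = batch_size
        self.pending = []

    def write(self, order_id, status):
        self.pending.append((order_id, status))
        if len(self.pending) >= self.batch_size:
            self.flush()

    def flush(self):
        # one round trip / one transaction instead of one per write
        self.conn.executemany(
            "INSERT INTO orders (id, status) VALUES (?, ?)", self.pending)
        self.conn.commit()
        self.pending.clear()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, status TEXT)")
buf = WriteBuffer(conn, batch_size=3)
buf.write(1, "new")        # buffered, db still empty
buf.write(2, "new")        # still buffered
buf.write(3, "shipped")    # third write reaches batch_size and triggers a flush
count = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
```

The trade-off is the one named above: callers see their writes "succeed" before the database does, which is only acceptable when eventual consistency is.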
Another thing to think about with an API server is it is another cog in your solution. If an application suddenly starts performing slowly, it could be the app, the workstation running the app, the API server, the database server, any network connection between the 3+ systems, load on any of the 3+ systems, and so on. Fewer cogs mean faster troubleshooting and less blame.
February 23, 2026 at 8:03 pm
thx all, I'm going to try to digest all of this and post back here. This thread has already opened my eyes to a possible benefit of APIs, in that they offer a layer of control, not unlike a governor. I've always considered them a nuisance, as in NetSuite, where they enforce a 4000 (or something like that) record limit that everybody tries to circumvent. Maybe they are a good thing, and maybe they motivate us to create hub or warehouse layers away from the ERP to spread the load.
February 23, 2026 at 9:44 pm
Stan, your thinking about the hub layer is the right direction, and honestly it's what we see work best in practice. Instead of having a dozen internal apps each making their own API calls to NetSuite (and fighting those record limits you mentioned), you sync the ERP data down to a local SQL Server instance and let everything query that. Your apps get fast local reads with regular T-SQL, NetSuite's API gets far less traffic, and you don't have connection sprawl across your SQL boxes from every app doing its own thing.
The part that gets messy is keeping that hub fresh. If you use SSIS packages on a schedule you'll always have some staleness, and maintaining those packages long term is a chore nobody signs up for. If you need changes to flow back to NetSuite too (say, an order status update), now you're building a second pipeline in the opposite direction.
We deal with this exact pattern at Stacksync (https://docs.stacksync.com/two-way-sync/connectors/sql-server), where we sync ERPs like NetSuite into SQL Server bidirectionally, so the hub stays current in both directions without separate pipelines. But even if you build it yourself, the hub architecture is solid. Just make sure you think through the freshness requirements early, because that drives whether you need real-time CDC-style sync or whether a scheduled pull every 15 minutes is good enough for your use case.
February 24, 2026 at 3:13 pm
thx dbruton, I have a feeling that when folks throw microservices at challenges like this, to push rather than pull data, it's entirely possible that pooling isn't happening. I saw a proposal recently for a "canonical" layer of data that would be available across the corp for anything; not long after that, the proposer started talking about a hub instead. It may sound crazy, but I think an unstructured hub could be a good surrogate for a warehouse.
February 24, 2026 at 3:47 pm
thx Ruben, luckily (I think) our users would be more confused by near-real-time updates than not. We aren't an airline. I think you have to agree that a pipeline, uni- or bi-directional, still has to deal with API and/or CData-like connection challenges at least once per day on the target instance. We are global, so we'd probably run our ETL multiple times with different configs to deal with the different time zones/instances.
I'm not the EDI guy here, but I believe we have to run pipelines the other way to get data into NetSuite. I don't know the volume, but I believe it's significant and often. I believe some of those ingestion APIs are homegrown, but either way I don't think a hub necessarily has a legitimate role in that direction.
Whether it's SSIS (whose days I think are numbered) or Fabric (which looks really viable to me) or something like Boomi or custom OO pipelines, etc., I wish more of us considered CDC-like solutions, which I think (I've asked in forums but never got an answer) don't create back pressure on ERPs. But I am only familiar with CDC-like data coming out of SQL instances/logs, not things like NetSuite, IBM, etc. And I did hear once from one infrastructure guy (not here) that so much info was pouring out of one of its logs that they couldn't keep up with the data for use in other ways.
February 24, 2026 at 4:10 pm
I've had customers make this a surrogate, and then you start to get lots of small, random requests and overwhelm the db. This is the danger of an API server: it can become a place where people think queries are less costly than db queries. They are, slightly, but the volumes usually cause issues.
February 24, 2026 at 9:09 pm
thx Steve, but such a surrogate would then federate/distribute the data, kind of like a spider web, out to consuming/subscriber systems that register their needs. I think the old term was hub and spoke; that way nobody comes to the well uninvited. Thoughts? Another term I've heard is "pub/sub".
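The hub-and-spoke / pub-sub idea can be sketched as a registry where subscriber systems declare what they need and the hub pushes matching changes out to them, so no consumer queries the ERP directly. Topic names and payloads below are hypothetical; real systems would use a broker (Service Broker, Kafka, etc.) rather than in-process callbacks.

```python
from collections import defaultdict

class Hub:
    """Toy pub/sub hub: spokes subscribe to topics; the hub fans changes out,
    so consumers never come to the well (the ERP) uninvited."""
    def __init__(self):
        self._subs = defaultdict(list)   # topic -> list of subscriber callbacks

    def subscribe(self, topic, callback):
        # a spoke registers its needs once, up front
        self._subs[topic].append(callback)

    def publish(self, topic, payload):
        # the hub pushes the change to every registered spoke
        for cb in self._subs[topic]:
            cb(payload)

received = []
hub = Hub()
hub.subscribe("orders.updated", lambda msg: received.append(("warehouse", msg)))
hub.subscribe("orders.updated", lambda msg: received.append(("billing", msg)))
hub.publish("orders.updated", {"order_id": 42, "status": "shipped"})
hub.publish("inventory.low", {"sku": "X1"})   # no subscribers registered: no work done
```

The key property for the load question in this thread is that the ERP is read once per change and the hub does the fan-out, instead of every consumer polling the source.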