I ran across an article on the best AI meeting assistants, tools that can help with note-taking and summarization. Most of them cost money, but something else struck me about these tools.
They aren't allowed for a lot of my meetings.
At Redgate, I'm not sure if these are allowed. Our policy is to seek permission before using a new tool, so I've sent off a query for more information. In many of my customer meetings, and in meetings with Microsoft, I've been told these tools aren't allowed because there isn't sufficient data security to permit their use.
I think a lot of people are concerned about their data when LLMs (large language models) are involved, and whether their data is being used to train future models. If it is, that could be a security problem. Many people and organizations are already upset that their data sets have been used for training without their knowledge, though not everyone agrees this is a problem.
I don't quite know how I feel about this issue, but I do think there ought to be guidelines and discussions about how data is used, where it is stored, and where it transits a network. Certainly, organizations might need to put guidelines in place to ensure their employees aren't exposing confidential or internal data in ways that could increase the risk of a breach. After all, without any rules, you may have no recourse if employees simply use these tools and data escapes your network.
At Redgate, we've begun experimenting with AI in SQL Prompt. We've already had quite a few queries about how customer data is used. Our EAP EULA notes that schema information will leave your network, but row data will not. That might or might not be a problem for you, but it's important to disclose, which we've done.
I'd like to think that as hardware power increases and models mature, we'll stop having to send data off the device performing an AI-related task. At some point, I hope your mobile phone or laptop can run the model without a network connection, which would help with data security.
AI is here to stay, and we have to learn how to work with it and where it works well. It feels as though we haven't really answered many of these questions, either as an industry or inside our organizations. That needs to change, and quickly, as the number of places AI is used continues to grow rapidly.