April 27, 2025 at 11:28 pm
Hi everyone,
Our company receives data sets from over 100 clients on a monthly basis. A single file may contain as few as 100 records, while some data sets are split across up to 5 files containing up to 15 million records in total. A client may send several types of files: Inforce, Premium, Claims, and Stamp Duty.
We are planning to re-engineer our data architecture and would like advice on the best way to:
Ingest these files,
Validate the data,
Transform it appropriately, and
Make it available for reporting in Power BI.
We prefer to use the Azure platform for this project.
What would you recommend for a scalable, maintainable solution?
Thanks in advance for your help!
April 28, 2025 at 12:35 am
Never heard of any of those file types (are they actual file formats, or something else?), but generally speaking, the pattern for ingesting data into Power BI is:
1. Filter for a specific file type.
2. Use Power Query to transform the data so it has the same shape and field types as the destination table(s) (a rough sketch of this step is below).
3. Import.
If you create a pipeline in Fabric, it should solve your problem.
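For illustration only, here is roughly what that shaping step (2) looks like if you express it in T-SQL over a staging table rather than in Power Query. Every table name, column name, and data type below is invented; adjust it to your own schema.

```sql
-- Raw staging table: every column landed as plain text first (names are hypothetical).
CREATE TABLE dbo.Staging_Premium
(
    PolicyNumber   varchar(50),
    EffectiveDate  varchar(50),
    PremiumAmount  varchar(50),
    SourceFileName varchar(260)
);

-- Reshape and type the rows so they match the destination table before importing.
-- TRY_CONVERT returns NULL instead of failing, which makes it easy to find bad rows.
INSERT INTO dbo.Premium (PolicyNumber, EffectiveDate, PremiumAmount, SourceFileName)
SELECT
    LTRIM(RTRIM(PolicyNumber)),
    TRY_CONVERT(date, EffectiveDate),
    TRY_CONVERT(decimal(18, 2), PremiumAmount),
    SourceFileName
FROM dbo.Staging_Premium
WHERE PolicyNumber IS NOT NULL;   -- very crude validation; real checks would log rejected rows
```

Whichever tool does the work, the point is that each incoming file is trimmed, typed, and renamed to match the destination table before it is imported.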
April 28, 2025 at 1:57 am
Thanks for your reply. I'll look into Fabric.
The data comes as CSV or Excel files. What I meant to say is that there are different datasets (e.g. premium data, claims data, and stamp duty), not different file formats.
April 29, 2025 at 3:20 am
For CSV and Excel you can pretty much use the Import Data wizard... it just fires up SSIS and runs the wizard for you. For CSV files you can also use something like BULK INSERT. If the data is clean, BULK INSERT can import a huge number of rows really fast.
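For the CSV side, a minimal BULK INSERT looks something like this; the file path, table name, and options are just placeholders to adapt.

```sql
-- Minimal BULK INSERT sketch; path, table, and options are assumptions.
BULK INSERT dbo.Staging_Premium
FROM 'C:\Imports\ClientA_Premium_202504.csv'
WITH (
    FORMAT          = 'CSV',    -- SQL Server 2017+, handles quoted fields
    FIRSTROW        = 2,        -- skip the header row
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '0x0a',   -- line feed; adjust if the files use CRLF
    TABLOCK                     -- allows faster, minimally logged loads
);
```

Excel files are a different story: BULK INSERT only reads flat files, so those usually go through the wizard/SSIS or get converted to CSV first.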