Perhaps, as in genetics after the recent CRISPR scandal, new AI / machine learning ventures (which require massive datasets that can't easily be manufactured) should be regulated, or in some cases prohibited. Missing passwords and unsecured production data in non-production contexts point to deeper underlying problems. In my experience, administering non-production environments can be more challenging than production, yet it often goes under the radar.
Maybe due diligence on the part of venture capitalists investing in 'big data' projects should include 'privacy provisions', just as any new oil drilling venture would require an environmental impact assessment. As the cliché goes, 'data is the new oil'.
This approach might even bolster the role of data professionals in an era where PaaS and IaaS make it very easy to 'get started'. I am sure many of us have seen a serviceable prototype rushed into production to begin the exploitation, before the solution has been stripped of its mirrors and chicken wire. Startups, by their nature, push aggressive schedules, potentially without the protective compliance infrastructure that established concerns take for granted.
Evisoft might crash and burn, losing investor cash, but what about the owners of the data that has been compromised? The big stick of GDPR can dole out very large fines, but it doesn't deal with the root causes. AI / machine learning is perhaps the gold rush of the moment, and we can do without a Wild West approach to all the data we know is out there (and especially the data we don't).