At this point I get about 1-2 emails a year telling me some company has exposed my private data in some way. It’s completely routine.
We need a law mandating the company pays at least $1k per exposed record per customer or absolutely nothing will change. The current cost of “here’s a year’s worth of credit monitoring” doesn’t even amount to a slap on the wrist.
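Back-of-envelope on what that proposal means at the scale of this leak (the $1k figure and the billion-record count are just the numbers from this thread, nothing official):

```python
# Hypothetical statutory penalty, using the numbers floated in this thread.
PENALTY_PER_RECORD = 1_000        # the proposed $1k per exposed record
exposed_records = 1_000_000_000   # the billion records reported in the leak

total_penalty = PENALTY_PER_RECORD * exposed_records
print(f"${total_penalty:,}")      # prints $1,000,000,000,000 (a trillion dollars)
```

A number that large is less a fine than a corporate death sentence, which is why per-record penalties get argued about so much further down the thread.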
rolandog 1 day ago
And tied to inflation (or to a % of gross income), too, otherwise it'll be cheaper in X years to get fined than to hire information security officers
overfeed 1 day ago
> We need a law mandating the company pays at least $1k per exposed record per customer or absolutely nothing will change.
That won't change a single thing, except for shell-company shenanigans, more frequent bankruptcy proceedings, and the same people coming back trading under a new name and logo. A law sending people to prison may actually change things.
troad 1 day ago
"Oh you want to make a little start up to share recipes between friends or whatever? Aww, that's cute. Well, here's the OAuth spec and an incomplete list of footguns. I hope your grasp of elliptic curves is strong. Prison time if you fail."
The only consequence of laws that criminalise mistakes in the handling of PII is to force everyone to externalise auth to the likes of Auth0. And you can bet your ass that if this ever happens, the likes of Auth0 will lobby like hell to never repeal or update those laws, being a vast corrupt funnel of business to them.
Congrats, you've created a new Intuit.
JeremyStinson 1 day ago
All those people have high-priced lawyers that will keep them out of prison. The DBA and the Data Engineer will be the ones who go to jail for "Not ensuring all applicable data security controls were configured, and enabled, to prevent the detection, collection, and modification of any and all data assets within the purview of Company X, all its holdings and subsidiaries."
lucyjojo 1 day ago
force nationalization of the business for egregious cases.
muyuu 19 hours ago
the main reason for this recent change is that before, they used to just not report it; it makes no financial sense for them, and they only do it because of recent legislation and liability
it's the only decent development from those data protection laws that usually do anything but protect data, but credit where it's due
Almost a month old, original source: https://cybernews.com/security/global-data-leak-exposes-bill...
and I've never seen any confirmation elsewhere
Looks like CyberNews have edited the article since I first saw it; it used to look quite suspicious and untrustworthy, but it now has more info. It still doesn't say exactly what a record is, or how many uniques there are.
frereubu 1 day ago
I presume the database exists, but some of the details don't add up. IDMerit say "IDMERIT’s systems and security infrastructure have never been compromised", "there has never been a data breach or exfiltration from [our partners'] systems during, before, or after this event" and "IDMerit does not own, control or store customer data". But Cybernews says that they "promptly secured the database" after being notified. Cybernews also didn't give the reason why they thought this was to do with IDMerit (unless I missed it). I can't quite make head nor tail of it.
0xbadcafebee 1 day ago
To sum up the updates in the article:
- IDMerit asked the security researcher for proof, the researcher asked for money first, so IDMerit balked
- IDMerit basically says they have no proof they were hacked, so they weren't
- The researcher is a freelancer... for CyberNews...
Even if somebody followed up with IDMerit, they will likely say they are not affected. The security researcher is probably the only person who could prove, at this point, whether or not they were vulnerable. If they don't come forward, we can only assume they weren't, but we don't know. This is a good lesson for responsible disclosure in the future.
...also, this is yet another example of why we need a regulated Software Building Code, with penalties for not conforming to it. If somebody is found to be hosting a public Mongo instance with no authentication, it should be reported to a state or federal agency, so that real penalties can be applied, the way they are for other code violations. And they shouldn't have been allowed to launch with that in the first place. It shouldn't be up to random "security researchers" to police businesses.
tootie 1 day ago
It's a weird article. For one, the researcher says "they believe" the data belongs to IDMerit but apparently isn't sure. IDMerit denies being the owner of the data, and says it doesn't belong to any of their partners either. And there are very few details about where or how they found this database. Is it possibly some kind of hoax or ransom attempt? Or are there really just billions of unaccounted-for databases of private data sitting all over the Internet?
uean 1 day ago
The cybernews article does have some screenshots showing names like “idmb2c” … also that IDMerit was contacted in November and the ports were closed a day later.
neya 1 day ago
If I were in Vegas, I would bet my life savings that the data of said ID verification company's CXOs isn't included in the leak. This is just like that McDonald's CEO video - they never use what they create.
submerge 1 day ago
I bet their data is included too, for two reasons:
First, identity verification data for KYC is a little bit different from fast food or social media, in that it's very difficult to live a normal life without being subject to any KYC checks. (I'm sure someone will chime in that they get paid in bitcoin and buy their groceries with cash.) If you are applying for some financial product or service that requires KYC, and they can't find any information about you, you will often either be denied that product or have to jump through a bunch of additional hoops to prove who you are. So it benefits CXOs to have their data included in these datasets; in fact, if they are well paid, they may well have more activity requiring KYC checks than the average person.
Second, and much more simply, one's own data often makes for a good test case since you know its accuracy.
neya 1 day ago
I'm not saying they don't need KYC, I'm simply saying they probably use a more secure alternative than their own service.
ezst 1 day ago
Or the tech executives barring their children from using social media.
neya 1 day ago
Absolutely!
egorfine 1 day ago
KYC = Kill Your Customer.
whatsupdog 1 day ago
Where the F does IDMerit even get all this data from? They have names, DOBs, addresses, phone numbers, national identity numbers for over a billion people? How?
wongarsu 1 day ago
The 1B number would contain multiple records per person.
For example, if I (as a German in Germany, ymmv) open a bank account online, that involves a call with one of these companies where they take pictures and information from my passport and check that that's me. Then I choose payment in installments at some online shop: same game. Apply for a small loan? Same game. Set up an account for trading (stock exchange or crypto)? You guessed it, another call. Another payment in installments, backed by the same bank? Apparently verifying my identity again is easier than checking their database. Each of those is another record, potentially with a new identity document, address or even name (maybe you got married), but mostly just the same data confirmed again with another timestamp.
Not all of them use the same identity verification service, but there aren't that many. And I wouldn't be surprised to learn that many are the same company under different brands.
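The records-vs-people distinction above can be sketched in a few lines (toy data; the names, fields, and the naive dedup key are all invented for illustration):

```python
# Toy illustration: each KYC event creates a new record for the same person,
# so raw record counts overstate the number of unique identities.
records = [
    {"name": "Erika Mustermann", "dob": "1980-01-01", "event": "bank account"},
    {"name": "Erika Mustermann", "dob": "1980-01-01", "event": "installment plan"},
    {"name": "Erika Mustermann", "dob": "1980-01-01", "event": "crypto exchange"},
    {"name": "Max Mustermann",   "dob": "1975-06-15", "event": "small loan"},
]

def unique_people(recs):
    """Naively collapse records to identities by a (name, dob) key."""
    return {(r["name"].casefold(), r["dob"]) for r in recs}

print(len(records), "records,", len(unique_people(records)), "people")
# prints: 4 records, 2 people
```

A real dump would need a much fuzzier key (typos, maiden names, address changes), which is exactly why "1B records" tells you little about how many unique people are exposed.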
shakna 1 day ago
A record is not necessarily unique. Name changes, address changes, phone number changes, can all create "new" records in dumps like these.
uean 1 day ago
Makes sense if the ID verification process involves scanning a driver license or passport.
Edit- rereading this, you're obviously talking about scale. The original article is much better: https://cybernews.com/security/global-data-leak-exposes-bill...
This made me absolutely livid:
> We requested a security incident report from the ethical hackers as proof
So instead of paying him a fair bug bounty, they demand that he write a formal report for them and prove to them that there is even a problem.
Totally unhinged, but it gets worse:
> the response was a demand for money for the report, which confirmed our suspicion that this was a ransom-related incident.
Wow. So when the security researcher informs them that he would be happy to do some consulting work for them and informs them of his rates, they flip out and accuse his initial good samaritan decision to inform the company of the issue of being part of a plot by him to hold the company for ransom?
Whoever thought this is both totally delusional and a complete jerk. Truly, no good deed goes unpunished.
Nobody told their marketing department:
https://www.idmerit.com/blog/idmerits-data-breach-fail-safe-...
archived for posterity: https://archive.ph/MdSfO
> We own and operate our proprietary platform, but we do not own, control or store customer data or the underlying data maintained by independent data sources.
This seems like a critical sentence. Is this database actually operated by IDMerit, or someone else? If so, who?
ericwebb 1 day ago
Remember when you'd get a letter in the mail, "your identity has been compromised; here is a subscription to an identity monitoring service."
The system is broken. We shouldn't be so vulnerable because of foundational infrastructure.
pirate787 1 day ago
While this leak may or may not have happened, for this type of exposure there should be criminal liability for developers and executives. Criminal negligence and prison time.
outime 1 day ago
If developers are going to face criminal liability, they should IMHO also have legal ways to push back against certain implementations without risking their jobs, or at least have a way to leave a legal justification somewhere: "I'm doing this because I'm forced to but I disagree" which is then signed by management.
Until then, you're putting the weight of the law on the wrong side of the equation, since developers aren't the ones consciously making risky decisions.
danlitt 1 day ago
Most countries already have whistleblower laws. If you are living somewhere that has any kind of "wrongful termination" legislation, an employer asking you to commit a crime is an open and shut case. I would guess that all of the USA and Europe would have existing sufficient protections, for example (although the US never ceases to surprise me).
activestore 1 day ago
They won’t do it just for protection.
The state would need to offer a reward, and maybe witness relocation.
rmnclmnt 1 day ago
Unrelated to the story but TIL AOL is still a thing in 2026!
xbar 1 day ago
Seems like it deserves to be its own post.
kevincloudsec 1 day ago
every age verification mandate creates another one of these databases: a billion records, no password, plain text.
bilekas 1 day ago
> That review identified no exposure, vulnerability or unauthorized access within the IDMERIT environment
The fact that they didn't vet their data providers has to be considered a form of negligence. In the end, it's up to the company I am handing my details to, not their providers, to act responsibly.
I hate this delegating of responsibility when it's not a good look, and this will only get worse now that the entire internet will soon be ID-gated. But don't worry, all the lapses in privacy and even security are in the name of 'saving the kids'.
djohnston 1 day ago
aol.com!?!?
jajuuka 1 day ago
An unprotected MongoDB instance, collections without a password, data in plain text. It's a textbook example of doing absolutely everything wrong.
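For context (not from the article, just the generic fix for this class of mistake): a mongod left on old defaults would listen on all interfaces with no authentication. The two settings below, in MongoDB's standard mongod.conf YAML format, are what close that hole; treat this as a sketch, not a complete hardening guide.

```yaml
# mongod.conf: minimal settings to avoid the "open MongoDB" failure mode
net:
  bindIp: 127.0.0.1        # don't listen on public interfaces
security:
  authorization: enabled   # require authenticated, authorized users
```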
plagiarist 1 day ago
Yet more proof that the US needs a HIPAA-equivalent covering PII.
stopbulying 1 day ago
Do such breaches make it trivial to lie to age and identity verification systems?
mbix77 1 day ago
What did measures like GDPR ever achieve, except for making me click a cookie prompt away?
Rygian 1 day ago
Actual punitive measures taken against entities who e.g. manipulate personal data in a negligent way. [1]
Right to be forgotten - you can ask companies to delete data they hold on you.
Data ownership/portability: you can ask companies for a copy of all data they hold on you or related to you.
I’ve seen the latter used by job applicants to get an entire copy of their interviews, transcripts and assessments, including the reason for not being hired.
saithir 1 day ago
It's really a wonder how, every time GDPR is even remotely relevant, there's always gotta be someone complaining about how GDPR is at fault for the cookie/data prompts, and never that sites and advertising companies (and their 2137 partners) are at fault for actually making those prompts as annoying as possible in hopes that you just agree.
etothepii 1 day ago
In the UK, open banking was essentially a response to GDPR. This has allowed (to a limited extent) a variety of tools to be built on top of bank accounts that otherwise would not have been.
It also makes you aware that a site is selling your data or otherwise tracking you, because otherwise they would not need a banner requesting your consent to do so :)
throwaway270925 1 day ago
Since people still seem to conflate the two, let me say it loud and clear:
GDPR HAS NOTHING TO DO WITH THE COOKIE PROMPTS!
pjc50 1 day ago
GDPR doesn't apply in the states, but hopefully it provides for some punishment for the poor security here for EU customers. Of course, then some Americans will get mad that a US company has to follow EU law.
bilekas 1 day ago
> Of course, then some Americans will get mad that a US company has to follow EU law.
This is always the way of the world though, if you want to do business anywhere, you are of course obligated to follow the local laws and regulations. I don't see anyone disputing this outside of blatant patent infringement by certain countries.
ralferoo 1 day ago
The GDPR applies worldwide to any data held about EU or UK citizens, regardless of where they reside. It does apply in the US, it's just potentially harder for the EU to enforce meaningful penalties for infractions.
majorchord 1 day ago
> It does apply in the US
EU law does not apply to US citizens residing in the US with no ties to the EU.
ralferoo 1 day ago
Correct. It does not apply to US citizens residing anywhere in the world. It does, however, as I said, apply to EU citizens regardless of where in the world they reside.
If a company holds data about EU citizens, the GDPR applies to them, regardless of where that company is based. Including the US. Hence the statement "It (GDPR) does apply in the US" is completely correct.
quesera 1 day ago
It's written that way, perhaps.
But there's no jurisdictional reality that any of country/union A's rights will protect a person while they are present in country/union B.
In the same way that a US citizen does not have legal protection for free speech when present in, e.g. China, Saudi Arabia, or Germany.
Even if the EU got the text incorporated into the UN Universal Declaration of Human Rights, there are famously many countries who are not signatories (and it would require a locally-implemented actual law to support its recognition).
The EU can arrange post facto penalties for violations of their citizens' rights, to be (potentially) administered in the future, when a responsible entity enters EU jurisdiction, but absolutely not before then without cooperation by treaty with the nation where these foreign-and-not-real "rights" were violated. Which would be a surrender of sovereignty and basically unimaginable.
(No comment on the goodness or successfulness of the GDPR here, just that no part of it is relevant outside of the EU regardless of how the text is composed.)
(And this is all written with awareness that the US somehow manages to selectively enforce their laws extra-jurisdictionally in weak foreign nations. The EU is not the US, and the US is not weak.)
ralferoo 23 hours ago
Right, that is why I wrote "it's just potentially harder for the EU to enforce meaningful penalties for infractions."
Your premise is true in one sense; however, the point remains: the GDPR covers all EU citizens, regardless of where the company is based. For small US companies, sure, the EU has very little power to enforce it, but larger companies that derive any revenue from the EU can be, and are, fined by the EU GDPR commissioners.
I can't find the source, but Google's AI in the search results also claims that "EU GDPR fines for U.S. companies are significant, with U.S. firms facing roughly 83% of total GDPR fines, totaling over €4.68 billion by early 2025". That 83% figure seems unreasonably high to me, but it's possibly just a consequence of the size of the fine being based on worldwide revenue and over half of the 20 biggest fines were to Google and Meta.
quesera 21 hours ago
I get it, but the operative point is not "potentially harder", but "literally impossible" to enforce -- unless the corp has some presence in the EU of course.
FWIW, I just checked Wikipedia to sanity-check my memory of our lawyers' guidance. Important differences from our discussion, if my read is correct:
GDPR does not apply to "EU citizens anywhere in the world", it applies to the personal data of "living persons ... inside the EU" or with data processed there.
(So GDPR would apply to a US citizen who is present in the EU, and/or being a user/customer of a vendor that operates in the EU)
From the "Misconceptions" section[0]:
> ## Misconceptions
>
> GDPR applies to anyone processing personal data of EU citizens anywhere in the world
>
> In fact, it applies to non-EU established organizations only where they are processing data of data subjects located in the EU (irrespective of their citizenship) and then only when supplying goods or services to them, or monitoring their behaviour.
(So GDPR would not apply to a EU citizen who is present in the US at the time of "processing", whether that's a service or product sale, etc)
This is important to my company. We are US-based, but have EU citizens as customers. For regulatory reasons, we block customer activity from outside the US, and we are not able to comply with GDPR (but we do have to be aware of CCPA[1] which has some similarities).
I'm not sure I agree with that interpretation, as "processing" is extremely broad and includes just storing the data.
Yes, technically if the EU citizen remains outside the EU for the entire lifecycle of the data up to and including deletion, then it isn't covered. But if you store that data at all when they have returned to the EU, then you need to comply with the GDPR in terms of handling that data.
Also, as a UK citizen (formerly an EU citizen), I don't understand why US-based companies are so against the GDPR, as essentially it's just a codification of how to do the morally best thing for your customers. Any data you don't need for a business purpose should be deleted as soon as possible. You can keep any data about someone as long as there is a justifiable business reason for it. You have to let someone know what data you have about them (if they request it via a SAR), and you have to give them the information up front to determine if they are happy with you handling their data, via a clear privacy policy and opt-in to having their data used.
Complying with the GDPR is pretty straightforward, as long as your intention isn't to profit by selling or otherwise making use of people's data in ways that they wouldn't be comfortable with. If you aren't doing anything bad with users' data and are already following good security practices, including deleting data that's no longer needed, then you are already compliant with the intent of the GDPR, and going from that to full compliance is probably only a matter of adding processes to be able to handle an SAR.
quesera 19 hours ago
I'd have to dig deeper, but generally "storing" (or maintaining storage) is not "processing" in the local US legal vernacular. Where both apply (PCI DSS, etc), both terms are used.
In my personal (business) case, we literally cannot comply with GDPR and also BSA/AML, FinCen, Reg E, KYC, etc, simultaneously. Our "business requirements" can last 7+ years, and our customers' wishes have no bearing on them.
And while we have no operations in any EU country, we are absolutely not obligated to even consider any EU laws about the data belonging to any of our customers, regardless of their citizenship. That's the primary point I'm making here -- the EU has zero jurisdiction over anything that happens outside the EU, ever, or any entities outside the EU, despite any claims to the contrary (which, according to Wikipedia at least, are not even made).
This is intuitive, but also the very expensive legal opinion of our lawyers, who have offices in the US, EU, and EMEA, for whatever that's worth!
In the general case, and as a customer, I'm fully in support of GDPR and CCPA-like protections. They're a great idea, I think! I'm usually the privacy nut in any discussion.
But compliance is obviously more work/expense than not, and small companies are especially allergic to nonproductive work and expenses. So naturally there's resistance to the suggestion that a foreign law compels them to do more of both.
And of course, if we're talking about the US, we have a very different culture around government and regulation. "As little as possible" (except those that protect my interests) is the preference of the landed gentry, and those who would aspire to same.
Reasonable people will recognize this as absurd, but ... you can't spell "absurd" without U, S, and A.
ralferoo 18 hours ago
Absolutely agree, effectively the EU can't touch small companies, they'd only be able to touch you if you generated any revenue in the EU that they could intercept.
As for "the EU has zero jurisdiction over anything that happens outside the EU, ever, or any entities outside the EU, despite any claims to the contrary" again, this isn't true if you conduct any business in the EU. Even for a company domiciled outside the EU, they could compel your payment processors to seize all payments to you from EU entities, for instance. The degree to which they'd fight pushback would depend on how serious the violation was and the size of the company, but you can be sure for instance that even if say Google had no EU presence at all, that the EU would make sure they complied with the GDPR or else ban them from the EU entirely.
In your situation, you actually have a good defence if there ever was an EU citizen trying to use the GDPR against you, because by taking efforts to not allow sales outside the US, you can argue that your services were never for sale to EU citizens. I guess if your TOS also said your service wasn't available to EU citizens, it would be even more watertight.
But just FYI for this:
> In my personal (business) case, we literally cannot comply with GDPR and also BSA/AML, FinCen, Reg E, KYC, etc, simultaneously. Our "business requirements" can last 7+ years, and our customers' wishes have no bearing on them.
I don't know the requirements of all those other things, but contrary to what a lot of people believe - you CAN store whatever you need to if there is a justifiable business reason for it, and you don't have to delete it even if a customer requests you to if there is a valid business requirement, such as regulatory or statutory compliance. The GDPR just compels you to be transparent with the data subject about what data is stored in those cases.
Almost every business in the EU is required to hold tax records of sales for 5 years, and so obviously these must be retained even after a customer stops being a customer and even if they request deletion of their PII data. What the GDPR requires is that you only keep the minimum required data to fulfill those statutory requirements, and delete anything else, and also not to use that data for any other purpose. Regular data should be deleted as soon as it is no longer needed.
I haven't studied the CCPA in depth, but my understanding is that it's very similar in scope to the GDPR and that complying with one would get you almost all the way to compliance with the other.
I also understand the general reaction against being told what to do by some other extra-territorial entity, but in today's society of cross-border trade, it's usually inescapable, apart from when they directly contradict - e.g. requirements to only store data in one territory.
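The retention logic described a couple of comments up (keep only what a statute requires, delete the rest once it's no longer needed, and don't reuse the retained data) can be sketched as follows; the field names and the statutory set are invented for illustration:

```python
# Hypothetical sketch of GDPR-style data minimisation for a closed account:
# retain only fields with a statutory reason (e.g. multi-year tax rules),
# delete everything else, and use the retained data only for compliance.
STATUTORY_FIELDS = {"invoice_id", "amount", "tax_id", "invoice_date"}

def minimise(record):
    """Return only the fields we are legally required to retain."""
    return {k: v for k, v in record.items() if k in STATUTORY_FIELDS}

closed_account = {
    "invoice_id": "A-1001",
    "amount": 99.0,
    "tax_id": "DE123456789",
    "invoice_date": "2024-03-01",
    "email": "ex-customer@example.com",   # no longer needed -> delete
    "marketing_prefs": "weekly",          # no longer needed -> delete
}

retained = minimise(closed_account)
print(sorted(retained))   # prints ['amount', 'invoice_date', 'invoice_id', 'tax_id']
```

In a real system the retention rules would of course be per-jurisdiction and time-bound (delete even the statutory fields once the retention period lapses); this only illustrates the field-level split being described.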
esperent 1 day ago
This is actually a Fox News article and as far as I can see it's not corroborated anywhere.
I saw a reddit thread about it earlier where someone said the apparent hacker refused to actually show any of the data and was asking for money. So probably just a scam rather than a real leak.
mapontosevenths 1 day ago
The Fox article just cites CyberNews.[0]
Cybernews posts screenshots[1] featuring usernames like idmKYCCN and idmKYCFR, and the ports were locked down after contacting ID Merit.
I think what's happened is that everyone is telling the literal truth, and speaking very carefully to use that truth to obscure rather than inform. To hell with the victims. The way I interpret this is that their denials are both factually accurate AND misleading.
The partner who said there is "no indication that any customer data has been compromised" is telling the literal truth. They can't find any indicators because they stink at logging, and the screenshots posted on CyberNews intentionally obscure the customer info; CyberNews only shows the IDM usernames in plaintext, which was the responsible thing to do. They literally can't see any indications... of customer data... because they don't have logs.
It should also be noted that the partner's customer in this case is likely IDMerit... not the people whose information was stolen. So again, their statement was literally true, even if they do find evidence of a billion records being leaked.
Nobody should ever trust anyone involved in this again if I'm correct in this interpretation of the available facts.
I presume the database exists, but some of the details don't add up. IDMerit say "IDMERIT’s systems and security infrastructure have never been compromised", "there has never been a data breach or exfiltration from [our partners'] systems during, before, or after this event" and "IDMerit does not own, control or store customer data". But Cybernews says that they "promptly secured the database" after being notified. Cybernews also didn't give the reason why they thought this was to do with IDMerit (unless I missed it). I can't quite make head nor tail of it.
To sum up the updates in the article
Even if somebody followed up with IDMerit, it's likely they will say they are not affected. The security researcher is probably the only person who could prove whether they were or not vulnerable, at this point. If they don't come forward, we can only assume they weren't vulnerable, but we don't know. This is a good lesson for responsible disclosure in the future....also, this is yet another example of why we need a regulated Software Building Code, with penalties for not conforming to it. If somebody is found to be hosting a public Mongo instance with no authentication, it should be reported to a state or federal agency, so that real penalties can be applied, the way they are for other code violations. And they shouldn't have been allowed to launch with that in the first place. It shouldn't be up to random "security researchers" to police businesses.
It's a weird article. For one, the researcher says "they believe" the data belongs to IDMerit but apparently aren't sure. IDMerit denies it's the owner of the data nor is it any of their partners. And there's very few details about where or how they found this database. It's possibly some kind of hoax or ransom attempt? Or there's really just billions of unaccounted databases of private data just sitting all over the Internet.
The cybernews article does have some screenshots showing names like “idmb2c” … also that IDMerit was contacted in November and the ports were closed a day later.
If I was in Vegas, I would bet my life savings that the CXOs of the said ID Verification company's data isn't included in the leak. This is just like that Mc Donald's CEO's video - they never use what they create.
I bet their data is included too, for two reasons:
First, identity verification data for KYC is a little bit different from fast food or social media in that it's very difficult to live a normal life without being subject to any KYC checks. (I'm sure someone will chime in that they get paid in bitcoin and buy their groceries with cash.) If you are applying for some financial product or service that requires KYC, and they can't find any information about you, you will often either be denied that product or have to jump through a bunch of additional hoops to prove who you are. So it benefits CXOs to have their data included in these datasets, in fact if they are well paid they may well have more activity requiring KYC checks than the average person.
Second, and much more simply, one's own data often makes for a good test case since you know its accuracy.
I am not debating that they don't need KYC, I'm simply saying they probably use a more secure alternative than their own.
Or the tech executives barring their children from using social media.
Absolutely!
KYC = Kill Your Customer.
[dead]
Where the F does IDMerit even get all this data from? They have names, DOBs, addressed, phone numbers, national identity numbers for over a billion people? How?
The 1B number would contain multiple records per person.
For example if I (as a German in Germany, ymmv) open a bank account online that involves a call with one of these companies where they take pictures and information from my passport and check that that's me. Then I choose payment in installments on some online shop, same game. Apply for a small loan? Same game. Set up an account for trading (stock exchange or crypto)? You guessed it, another call. Another payment in installments, backed by the same bank? Apparently verifying my identity again is easier than checking their database. Each of those is another record. Potentially with a new identity document, address or even name (maybe you got married) but mostly just the same data confirmed again with another timestamp
Not all of them use the same identity verification service, but there aren't that many. And I wouldn't be surprised to learn that many are the same company under different brands
A record is not necessarily unique. Name changes, address changes, phone number changes, can all create "new" records in dumps like these.
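As a rough sketch of why a billion raw rows can overstate the number of people affected, deduplicating on a stable identifier collapses the repeated verification events into individuals. The record layout and field names below are hypothetical, purely for illustration:

```python
from collections import defaultdict

# Hypothetical sample "records": the same person appears once per
# verification event, sometimes with a changed address or surname.
records = [
    {"national_id": "DE-123", "name": "A. Schmidt", "address": "Old St 1"},
    {"national_id": "DE-123", "name": "A. Schmidt", "address": "New St 9"},
    {"national_id": "DE-123", "name": "A. Mueller", "address": "New St 9"},  # name change
    {"national_id": "FR-456", "name": "B. Martin", "address": "Rue X 2"},
]

# Group by the stable identifier to estimate unique individuals.
people = defaultdict(list)
for rec in records:
    people[rec["national_id"]].append(rec)

print(len(records))  # 4 raw records...
print(len(people))   # ...but only 2 unique people
```

Of course, if the dump spans multiple identity documents per person (passport plus national ID, say), even this undercounts the collapse.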
Makes sense if the ID verification process involves scanning a driver license or passport.
Edit- rereading this, you’re obviously talking about scale. The original article is much better : https://cybernews.com/security/global-data-leak-exposes-bill...
This made me absolutely livid:
> We requested a security incident report from the ethical hackers as proof
So instead of paying him a fair bug bounty, they demand that he write a formal report for them and prove to them that there is even a problem.
Totally unhinged, but it gets worse:
> the response was a demand for money for the report, which confirmed our suspicion that this was a ransom-related incident.
Wow. So when the security researcher informs them that he would be happy to do some consulting work for them, and tells them his rates, they flip out and recast his initial good-samaritan decision to report the issue as part of a plot to hold the company for ransom?
Whoever came up with that is both totally delusional and a complete jerk. Truly, no good deed goes unpunished.
Nobody told their marketing department:
https://www.idmerit.com/blog/idmerits-data-breach-fail-safe-...
archived for posterity: https://archive.ph/MdSfO
> We own and operate our proprietary platform, but we do not own, control or store customer data or the underlying data maintained by independent data sources.
This seems like a critical sentence. Is this database actually operated by IDMerit, or someone else? If so, who?
Remember when you'd get a letter in the mail, "your identity has been compromised, here is a subscription to an identity monitoring service"?
The system is broken. We shouldn't be so vulnerable because of foundational infrastructure.
While this leak may or may not have happened, for this type of exposure there should be criminal liability for developers and executives. Criminal negligence and prison time.
If developers are going to face criminal liability, they should IMHO also have legal ways to push back against certain implementations without risking their jobs, or at least have a way to leave a legal justification somewhere: "I'm doing this because I'm forced to but I disagree" which is then signed by management.
Until then, you're putting the weight of the law on the wrong side of the equation, since developers aren't the ones consciously making risky decisions.
Most countries already have whistleblower laws. If you are living somewhere that has any kind of "wrongful termination" legislation, an employer asking you to commit a crime is an open and shut case. I would guess that all of the USA and Europe would have existing sufficient protections, for example (although the US never ceases to surprise me).
They won’t do it just for protection.
The state would need to offer a reward, and maybe witness relocation.
Unrelated to the story but TIL AOL is still a thing in 2026!
Seems like it deserves to be its own post.
Every age verification mandate creates another one of these databases. A billion records, no password, plain text.
> That review identified no exposure, vulnerability or unauthorized access within the IDMERIT environment
The fact that they didn't vet their data providers then has to be considered a form of negligence. In the end, it's the company I'm handing my details over to that has to act responsibly, not their providers.
I hate this delegating of responsibility when it's not a good look, and it will only get worse now that the entire internet will be ID-gated soon. But don't worry, all these lapses in privacy and even security are in the name of 'saving the kids'.
aol.com!?!?
Unprotected MongoDB, collections without a password, data in plain text. It's a textbook example of doing absolutely everything wrong.
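For context, older MongoDB deployments shipped with access control disabled unless explicitly turned on, so anyone who could reach the port could list and dump every collection. A minimal sketch of the standard mongod.conf settings that prevent exactly this (assuming a typical single-node setup):

```yaml
# /etc/mongod.conf -- require authentication so anonymous clients
# cannot list or dump collections
security:
  authorization: enabled
net:
  # bind only to localhost (or an internal interface), never 0.0.0.0
  bindIp: 127.0.0.1
```

With `authorization: enabled`, clients must authenticate as a user created via `db.createUser()` before reading anything; binding to an internal interface keeps the port off the public internet in the first place.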
Yet another point of proof that the US needs a HIPAA covering PII.
Do such breaches make it trivial to lie to age and identity verification systems?
What did measures like the GDPR ever achieve, except making me click away a cookie prompt?
Actual punitive measures taken against entities who e.g. manipulate personal data in a negligent way. [1]
Which was much harder to achieve before.
[1] https://www.enforcementtracker.com/
Right to be forgotten - you can ask companies to delete data they hold on you.
Data ownership/portability: you can ask companies for a copy of all data they hold on you or related to you.
I’ve seen the latter used by job applicants to get an entire copy of their interviews, transcripts and assessments including the reason for not being hired.
It's really a wonder how every time gdpr is even remotely related, there's always gotta be someone complaining about how gdpr is at fault for the cookie/data prompts, and never that sites and advertising companies (and their 2137 partners) are at fault for actually making those prompts as annoying as possible in hopes that you just agree.
In the UK, open banking was essentially a response to GDPR; this has allowed (to a limited extent) a variety of tools to be built on top of bank accounts that otherwise would not have been.
That was actually the two Payment Services Directives: https://blog.finexer.com/guide-to-psd2-regulation-for-open-b...
It makes you aware a site is selling your data or is otherwise tracking you because otherwise they would not need a banner to request for consents to do so :)
Since people still seem to conflate the two, let me say it loud and clear:
GDPR HAS NOTHING TO DO WITH THE COOKIE PROMPTS!
GDPR doesn't apply in the states, but hopefully it provides for some punishment for the poor security here for EU customers. Of course, then some Americans will get mad that a US company has to follow EU law.
> Of course, then some Americans will get mad that a US company has to follow EU law.
This is always the way of the world though, if you want to do business anywhere, you are of course obligated to follow the local laws and regulations. I don't see anyone disputing this outside of blatant patent infringement by certain countries.
The GDPR applies worldwide to any data held about EU or UK citizens, regardless of where they reside. It does apply in the US, it's just potentially harder for the EU to enforce meaningful penalties for infractions.
> It does apply in the US
EU law does not apply to US citizens residing in the US with no ties to the EU.
Correct. It does not apply to US citizens residing anywhere in the world. It does, however, as I said, apply to EU citizens regardless of where in the world they reside.
If a company holds data about EU citizens, the GDPR applies to them, regardless of where that company is based. Including the US. Hence the statement "It (GDPR) does apply in the US" is completely correct.
It's written that way, perhaps.
But there's no jurisdictional reality that any of country/union A's rights will protect a person while they are present in country/union B.
In the same way that a US citizen does not have legal protection for free speech when present in, e.g. China, Saudi Arabia, or Germany.
Even if the EU got the text incorporated into the UN Universal Declaration of Human Rights, there are famously many countries who are not signatories (and it would require a locally-implemented actual law to support its recognition).
The EU can arrange post facto penalties for violations of their citizens' rights, to be (potentially) administered in the future, when a responsible entity enters EU jurisdiction, but absolutely not before then without cooperation by treaty with the nation where these foreign-and-not-real "rights" were violated. Which would be a surrender of sovereignty and basically unimaginable.
(No comment on the goodness or successfulness of the GDPR here, just that no part of it is relevant outside of the EU regardless of how the text is composed.)
(And this is all written with awareness that the US somehow manages to selectively enforce their laws extra-jurisdictionally in weak foreign nations. The EU is not the US, and the US is not weak.)
Just, that is why I wrote "it's just potentially harder for the EU to enforce meaningful penalties for infractions."
Your premise is true in one sense; however, the point remains: the GDPR covers all EU citizens, regardless of where the company is based. For small US companies, sure, the EU has very little power to enforce it, but larger companies that derive any revenue from the EU can be, and are, fined by the EU GDPR commissioners.
There is more information here: https://www.gdpradvisor.co.uk/does-gdpr-affect-us-companies or here: https://www.clarip.com/data-privacy/gdpr-united-states/ or here: https://www.usitc.gov/publications/332/executive_briefings/g... or here: https://dataprivacymanager.net/5-biggest-gdpr-fines-so-far-2... (that last one, 16 of the 20 biggest fines were for companies outside the EU)
I can't find the source, but Google's AI in the search results also claims that "EU GDPR fines for U.S. companies are significant, with U.S. firms facing roughly 83% of total GDPR fines, totaling over €4.68 billion by early 2025". That 83% figure seems unreasonably high to me, but it's possibly just a consequence of the size of the fine being based on worldwide revenue and over half of the 20 biggest fines were to Google and Meta.
I get it, but the operative point is not "potentially harder", but "literally impossible" to enforce -- unless the corp has some presence in the EU of course.
FWIW, I just checked Wikipedia to sanity-check my memory of our lawyers' guidance. Important differences from our discussion, if my read is correct:
GDPR does not apply to "EU citizens anywhere in the world", it applies to the personal data of "living persons ... inside the EU" or with data processed there.
(So GDPR would apply to a US citizen who is present in the EU, and/or being a user/customer of a vendor that operates in the EU)
From the "Misconceptions" section[0]:
(So GDPR would not apply to an EU citizen who is present in the US at the time of "processing", whether that's a service or product sale, etc.)

This is important to my company. We are US-based, but have EU citizens as customers. For regulatory reasons, we block customer activity from outside the US, and we are not able to comply with GDPR (but we do have to be aware of the CCPA[1], which has some similarities).
[0] https://en.wikipedia.org/wiki/General_Data_Protection_Regula...
[1] https://en.wikipedia.org/wiki/California_Consumer_Privacy_Ac...
I'm not sure I agree with that interpretation, as "processing" is extremely broad and includes just storing the data.
Yes, technically if the EU citizen remains outside the EU for the entire lifecycle of the data up to and including deletion, then it isn't covered. But if you store that data at all when they have returned to the EU, then you need to comply with the GDPR in terms of handling that data.
Also, as a UK citizen (formerly an EU citizen), I don't understand why US-based companies are so against the GDPR, as essentially it's just a codification of how to do the morally best thing for your customers. Any data you don't need for a business purpose should be deleted as soon as possible. You can hold any data about someone as long as there is a justifiable business reason for it. You have to let someone know what data you have about them (if they request it via an SAR), and you have to give them the information up front to determine if they are happy with you handling their data, via a clear privacy policy and an opt-in to having their data used.
Complying with the GDPR is pretty straightforward, as long as your intention isn't to profit by selling or otherwise making use of people's data in ways they wouldn't be comfortable with. If you aren't doing anything bad with users' data and are already following good security practices, including deleting data that's no longer needed, then you are already compliant with the intent of the GDPR, and going from that to full compliance is probably only a matter of adding processes to handle an SAR.
I'd have to dig deeper, but generally "storing" (or maintaining storage) is not "processing" in the local US legal vernacular. Where both apply (PCI DSS, etc), both terms are used.
In my personal (business) case, we literally cannot comply with GDPR and also BSA/AML, FinCen, Reg E, KYC, etc, simultaneously. Our "business requirements" can last 7+ years, and our customers' wishes have no bearing on them.
And while we have no operations in any EU country, we are absolutely not obligated to even consider any EU laws about the data belonging to any of our customers, regardless of their citizenship. That's the primary point I'm making here -- the EU has zero jurisdiction over anything that happens outside the EU, ever, or any entities outside the EU, despite any claims to the contrary (which, according to Wikipedia at least, are not even made).
This is intuitive, but also the very expensive legal opinion of our lawyers, who have offices in the US, EU, and EMEA, for whatever that's worth!
In the general case, and as a customer, I'm fully in support of GDPR and CCPA-like protections. They're a great idea, I think! I'm usually the privacy nut in any discussion.
But compliance is obviously more work/expense than not, and small companies are especially allergic to nonproductive work and expenses. So naturally there's resistance to the suggestion that a foreign law compels them to do more of both.
And of course, if we're talking about the US, we have a very different culture around government and regulation. "As little as possible" (except those that protect my interests) is the preference of the landed gentry, and those who would aspire to same.
Reasonable people will recognize this as absurd, but ... you can't spell "absurd" without U, S, and A.
Absolutely agree, effectively the EU can't touch small companies, they'd only be able to touch you if you generated any revenue in the EU that they could intercept.
As for "the EU has zero jurisdiction over anything that happens outside the EU, ever, or any entities outside the EU, despite any claims to the contrary": again, this isn't true if you conduct any business in the EU. Even for a company domiciled outside the EU, they could compel your payment processors to seize all payments to you from EU entities, for instance. How hard they'd press in the face of pushback would depend on how serious the violation was and the size of the company, but you can be sure, for instance, that even if Google had no EU presence at all, the EU would make sure they complied with the GDPR or else ban them from the EU entirely.
In your situation, you actually have a good defence if there ever was an EU citizen trying to use the GDPR against you, because by taking efforts to not allow sales outside the US, you can argue that your services were never for sale to EU citizens. I guess if your TOS also said your service wasn't available to EU citizens, it would be even more watertight.
But just FYI for this:

> In my personal (business) case, we literally cannot comply with GDPR and also BSA/AML, FinCen, Reg E, KYC, etc, simultaneously. Our "business requirements" can last 7+ years, and our customers' wishes have no bearing on them.
I don't know the requirements of all those other things, but contrary to what a lot of people believe - you CAN store whatever you need to if there is a justifiable business reason for it, and you don't have to delete it even if a customer requests you to if there is a valid business requirement, such as regulatory or statutory compliance. The GDPR just compels you to be transparent with the data subject about what data is stored in those cases.
Almost every business in the EU is required to hold tax records of sales for 5 years, and so obviously these must be retained even after a customer stops being a customer and even if they request deletion of their PII data. What the GDPR requires is that you only keep the minimum required data to fulfill those statutory requirements, and delete anything else, and also not to use that data for any other purpose. Regular data should be deleted as soon as it is no longer needed.
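The retention rule described above can be sketched in a few lines. The field names and the 5-year statutory window below are hypothetical stand-ins; this is an illustration of the data-minimization principle, not legal advice:

```python
from datetime import date, timedelta

# Hypothetical retention policy: a sale record must be kept for a
# statutory 5-year tax window, but only the minimum fields required
# for that purpose; everything else is deleted.
STATUTORY_FIELDS = {"invoice_id", "amount", "sale_date", "vat_number"}
STATUTORY_RETENTION = timedelta(days=5 * 365)

def apply_retention(record: dict, today: date):
    """Return the minimal record to retain, or None to delete entirely."""
    if today - record["sale_date"] <= STATUTORY_RETENTION:
        # Inside the statutory window: keep only the required fields.
        return {k: v for k, v in record.items() if k in STATUTORY_FIELDS}
    return None  # window elapsed: nothing left to justify retention

record = {
    "invoice_id": "INV-1", "amount": 99.0, "sale_date": date(2023, 1, 1),
    "vat_number": "DE999", "name": "A. Schmidt", "email": "a@example.com",
}
kept = apply_retention(record, date(2024, 1, 1))
print(sorted(kept))  # name and email are stripped; statutory fields remain
```

The point is that a deletion request doesn't override the statutory basis, but the statutory basis also doesn't justify keeping the name and email once they serve no required purpose.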
I haven't studied the CCPA in depth, but my understanding is that it's very similar in scope to the GDPR and that complying with one would get you almost all the way to compliance with the other.
I also understand the general reaction against being told what to do by some other extra-territorial entity, but in today's society of cross-border trade, it's usually inescapable, apart from when they directly contradict - e.g. requirements to only store data in one territory.
This is actually a Fox News article and as far as I can see it's not corroborated anywhere.
I saw a reddit thread about it earlier where someone said the apparent hacker refused to actually show any of the data and was asking for money. So probably just a scam rather than a real leak.
The Fox article just cites CyberNews.[0]
Cybernews posts screenshots[1] featuring usernames like idmKYCCN and idmKYCFR, and the ports were locked down after contacting ID Merit.
I think that what's happened is that everyone is telling the literal truth and speaking very carefully, using that truth to obscure rather than inform. To hell with the victims. The way I interpret this is that their denials are both factually accurate AND misleading.
The partner who said there is "no indication that any customer data has been compromised" is telling the literal truth. They can't find any indicators because they stink at logging, and the screenshots posted on CyberNews intentionally obscure the customer info; CyberNews only shows the IDM usernames in plaintext, which was the responsible thing to do. They literally can't see any indications... of customer data... because they don't have logs.
It should also be noted that the partner's customer in this case is likely ID Merit... not the people whose information was stolen. So again, their statement was literally true even if they do find evidence of a billion records being leaked.
Nobody should ever trust anyone involved in this again if I'm correct in this interpretation of the available facts.
[0] https://www.foxnews.com/tech/1-billion-identity-records-expo...
[1] https://cybernews.com/security/global-data-leak-exposes-bill...