Data farming operation subtly targets customer-facing companies

January 8, 2025
Data Farming Operation · Dark Web · Threat Alert · Cybercriminals · Social Engineering

A newly uncovered data farming operation on the dark web has targeted various customer-facing firms. These organisations should strengthen their authentication processes, as the threat could inflict significant damage on affected individuals.

One researcher discovered that an anonymous underground cybercriminal group harvested massive numbers of identity documents and face photos to defeat Know Your Customer (KYC) verification processes.

However, a separate investigation suggested that the data subjects may have willingly provided the photographs and documents in exchange for cash.

This discovery highlights an additional challenge for companies that require selfies to authenticate a customer’s identity online. Authorities advise these entities that they must detect not just forged documents, but also legitimate ones used by malicious individuals.
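One way to catch legitimate documents in the wrong hands is to flag IDs that surface across multiple unrelated accounts, since farmed identity packages are typically resold and reused. The sketch below is illustrative only; the function name, input shape, and the idea of hashing uploaded ID images are assumptions, not details from the alert:

```python
from collections import defaultdict

def find_reused_documents(submissions):
    """Return document hashes submitted by more than one account.

    submissions: iterable of (account_id, doc_hash) pairs, where
    doc_hash could be, say, a perceptual hash of the uploaded ID image.
    """
    accounts_per_doc = defaultdict(set)
    for account_id, doc_hash in submissions:
        accounts_per_doc[doc_hash].add(account_id)
    # A genuine document normally belongs to exactly one customer;
    # the same document appearing on several accounts is a strong
    # signal of a farmed or resold identity package.
    return {doc for doc, accts in accounts_per_doc.items() if len(accts) > 1}
```

In practice a check like this would sit alongside liveness detection and velocity rules rather than replace them; exact-hash matching is deliberately simplified here.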


The data farming operation commonly targets people who voluntarily provide their identification.


Investigations show that this data farming operation targets people who knowingly surrender their identities for short-term financial gain.

Researchers explained that once people sell their identity documents and biometric data, they risk their financial security and supply criminals with complete, legitimate identity packages.

These personal details could allow the threat actors to orchestrate further attacks using legitimate identification. Examples include social engineering, identity fraud, and phishing campaigns.

Meanwhile, the researchers who discovered the campaign believe its operators are based in Latin America, although a similar investigation traced the threat actors to Eastern Europe.

Furthermore, legitimate identity documents are not the only means cybercriminals use to bypass onboarding and login verification. Last month, law enforcement agencies warned the public about AI-powered deepfakes, which now account for one-quarter of fraudulent attempts to pass motion-based biometric checks. Banks and other service providers commonly employ these checks to verify individuals.

Still, deepfake technology is used far less often against simpler selfie-based authentication schemes, primarily because these are easier to defeat with more traditional methods.

Organisations should scrutinise their identity-verification requirements, as cybercriminals constantly find new ways to steal information.
