{
	"id": "74fff9ee-678b-4967-b928-35cfca134e9e",
	"created_at": "2026-04-06T00:19:09.268508Z",
	"updated_at": "2026-04-10T03:21:15.626119Z",
	"deleted_at": null,
	"sha1_hash": "c2470feee44d9e461c5e38147853e71e38933b50",
	"title": "Internet Crime Complaint Center (IC3)",
	"llm_title": "",
	"authors": "",
	"file_creation_date": "0001-01-01T00:00:00Z",
	"file_modification_date": "0001-01-01T00:00:00Z",
	"file_size": 42069,
	"plain_text": "Internet Crime Complaint Center (IC3)\r\nPublished: 2024-12-03 · Archived: 2026-04-05 23:41:46 UTC\r\nThe FBI is warning the public that criminals exploit generative artificial intelligence (AI) to commit fraud on a larger scale, which increases the believability of their schemes. Generative AI reduces the time and effort criminals must expend to deceive their targets. Generative AI takes what it has learned from examples input by a user and synthesizes something entirely new based on that information. These tools assist with content creation and can correct for human errors that might otherwise serve as warning signs of fraud. The creation or distribution of synthetic content is not inherently illegal; however, synthetic content can be used to facilitate crimes, such as fraud and extortion. Since it can be difficult to identify when content is AI-generated, the FBI is providing the following examples of how criminals may use generative AI in their fraud schemes to increase public recognition and scrutiny.\r\nAI-Generated Text\r\nCriminals use AI-generated text to appear believable to a reader in furtherance of social engineering, spear phishing, and financial fraud schemes such as romance, investment, and other confidence schemes, or to overcome common indicators of fraud schemes.\r\nCriminals use generative AI to create voluminous fictitious social media profiles used to trick victims into sending money.\r\nCriminals create messages to send to victims faster, allowing them to reach a wider audience with believable content.\r\nCriminals use generative AI tools to assist with language translations to limit grammatical or spelling errors for foreign criminal actors targeting US victims.\r\nCriminals generate content for fraudulent websites for cryptocurrency investment fraud and other investment schemes.\r\nCriminals embed AI-powered chatbots in fraudulent websites to prompt victims to click on malicious 
links.\r\nAI-Generated Images\r\nCriminals use AI-generated images to create believable social media profile photos, identification documents, and other images in support of their fraud schemes.\r\nCriminals create realistic images for fictitious social media profiles in social engineering, spear phishing, romance schemes, confidence fraud, and investment fraud.\r\nCriminals generate fraudulent identification documents, such as fake driver's licenses or credentials (law enforcement, government, or banking), for identity fraud and impersonation schemes.\r\nCriminals use generative AI to produce photos to share with victims in private communications to convince victims they are speaking to a real person.\r\nhttps://www.ic3.gov/PSA/2024/PSA241203\r\nPage 1 of 3\n\nCriminals use generative AI tools to create images of celebrities or social media personas promoting counterfeit products or non-delivery schemes.\r\nCriminals use generative AI tools to create images of natural disasters or global conflict to elicit donations to fraudulent charities.\r\nCriminals use generative AI tools to create images used in market manipulation schemes.\r\nCriminals use generative AI tools to create pornographic photos of a victim to demand payment in sextortion schemes.\r\nAI-Generated Audio, aka Vocal Cloning\r\nCriminals can use AI-generated audio to impersonate well-known public figures or personal relations to elicit payments.\r\nCriminals generate short audio clips containing a loved one's voice to impersonate a close relative in a crisis situation, asking for immediate financial assistance or demanding a ransom.\r\nCriminals obtain access to bank accounts using AI-generated audio clips of individuals and impersonating them.\r\nAI-Generated Videos\r\nCriminals use AI-generated videos to create believable depictions of public figures to bolster their fraud schemes.\r\nCriminals generate videos for real-time video chats with alleged company executives, 
law enforcement, or other authority figures.\r\nCriminals create videos for private communications to \"prove\" the online contact is a \"real person.\"\r\nCriminals use generative AI tools to create videos for fictitious or misleading promotional materials for investment fraud schemes.\r\nTips to protect yourself\r\nCreate a secret word or phrase with your family to verify their identity.\r\nLook for subtle imperfections in images and videos, such as distorted hands or feet, unrealistic teeth or eyes, indistinct or irregular faces, unrealistic accessories such as glasses or jewelry, inaccurate shadows, watermarks, lag time, voice matching, and unrealistic movements.\r\nListen closely to the tone and word choice to distinguish between a legitimate phone call from a loved one and an AI-generated vocal clone.\r\nIf possible, limit online content of your image or voice, make social media accounts private, and limit followers to people you know to minimize fraudsters' capabilities to use generative AI software to create fraudulent identities for social engineering.\r\nVerify the identity of the person calling you by hanging up the phone, researching the contact information of the bank or organization purporting to call you, and calling the phone number directly.\r\nNever share sensitive information with people you have met only online or over the phone.\r\nDo not send money, gift cards, cryptocurrency, or other assets to people you do not know or have met only online or over the phone.\r\nIf you believe you have been a victim of a financial fraud scheme, please file a report with the FBI's Internet Crime Complaint Center at www.ic3.gov. 
If possible, include the following:\r\nIdentifying information about the individuals, including name, phone number, address, and email address.\r\nFinancial transaction information, such as the date, type of payment, amount, account numbers involved, the name and address of the receiving financial institution, and receiving cryptocurrency addresses.\r\nA description of your interaction with the individual, including how contact was initiated, such as the type of communication, the purpose of the request for money, how you were told or instructed to make payment, what information you provided to the scammer, and any other details pertinent to your complaint.\r\nSource: https://www.ic3.gov/PSA/2024/PSA241203",
	"extraction_quality": 1,
	"language": "EN",
	"sources": [
		"MITRE"
	],
	"references": [
		"https://www.ic3.gov/PSA/2024/PSA241203"
	],
	"report_names": [
		"PSA241203"
	],
	"threat_actors": [],
	"ts_created_at": 1775434749,
	"ts_updated_at": 1775791275,
	"ts_creation_date": 0,
	"ts_modification_date": 0,
	"files": {
		"pdf": "https://archive.orkl.eu/c2470feee44d9e461c5e38147853e71e38933b50.pdf",
		"text": "https://archive.orkl.eu/c2470feee44d9e461c5e38147853e71e38933b50.txt",
		"img": "https://archive.orkl.eu/c2470feee44d9e461c5e38147853e71e38933b50.jpg"
	}
}