AI supercharges cybercrime — scams become faster, cheaper and harder to detect

AI is rapidly transforming cybercrime by enabling highly convincing, personalized scams at scale. Security researchers estimate that 50–75% of global phishing and spam now originates from AI systems that mimic company tone, reference public events and produce realistic voice and video impersonations. Dark web markets sell AI-powered hacking tools on subscription (e.g., WormGPT, FraudGPT, DarkGPT) with tiered pricing and support for as little as ~$90/month, lowering the barrier to entry.

Experts from Carnegie Mellon, Google Threat Intelligence Group and industry firms (Anthropic, OpenAI, Google, Darktrace) warn that criminal groups can automate targeting, reconnaissance and payload creation, making operations faster, leaner and more profitable. While fully autonomous attacks remain limited, AI has already replicated complex breaches in lab tests. Defenders are also using AI to scan code and find vulnerabilities, but human oversight remains essential.

For traders: the rise in AI-driven phishing and fraud raises operational risk for exchanges, custodians and DeFi platforms, potentially increasing short-term volatility if high-profile breaches occur and prompting tighter regulation and higher security spending in the medium term.
Bearish
AI-enhanced cybercrime raises operational and counterparty risk across crypto markets. More convincing phishing, voice and video impersonation, and low-cost AI attack tools increase the probability of successful hacks and social-engineering attacks against exchanges, custodians and major DeFi protocols. Historical precedent shows that prominent security breaches (exchange hacks, major wallet drains) trigger immediate sell-offs, liquidity withdrawals and heightened volatility.

In the short term, markets may react negatively to any reported AI-driven incident, causing price dips and wider spreads. In the medium term, the sector will likely face higher compliance and security costs, possible regulatory intervention, and more concentrated liquidity as custodial risk is repriced, all of which are typically bearish. Offsetting factors include accelerated investment in security and AI-driven defensive tools; these reduce long-term systemic risk but do not eliminate the near-term headwinds.