

Forbes: Shadow AI Has Already Moved Into Your Organization

Published originally on March 19, 2026 by Tony Bradley on Forbes.


Somewhere in your organization right now, someone is pasting customer data into an AI tool you’ve never approved.

Not long ago, the challenge was employees spinning up cloud services, personal Dropbox accounts and unauthorized SaaS tools without telling IT. That was a headache. What’s happening now is a different problem at a different scale.

Today, an employee can open a browser tab, paste in sensitive customer data and have a conversation with an AI model the company has never heard of—in about 30 seconds, with no installation, no approval process and no visibility into what just happened.

And the instinct a lot of organizations had early on—just block the tools—isn’t a real answer to that.

I had an opportunity to chat with Mark St. John and Cody Pierce, co-founders of browser security company Neon Cyber, about how organizations are—or aren’t—dealing with the explosion of AI tools in the workplace. Mark put it plainly: “Not that users are nefarious, but they’re going to find a way to use the tools that they want. And so having a way to find them is critical.”

...

For every tool a security team blocks, there are ten others they haven’t heard of yet. The AI tool landscape expands weekly. New models, new interfaces, new browser extensions, new workflows—all showing up faster than any policy team can track.

What makes this harder is who’s actually driving the adoption. Most employees are content to stick with the sanctioned tools they’re given. But the curious, technically savvy employees—the ones actively exploring new capabilities—know just enough to find these tools, yet may not fully understand the data exposure risks they’re creating.

Cody addressed this directly: “You really want something that adapts quickly. You have to have something that kind of understands that, and then will give options to allow it or disallow.” The idea isn’t to stop employees from experimenting—it’s to maintain some visibility and control while that experimentation happens.


Read the full article on Forbes.
