
Oversharing Perspective: Why Accidental Data Exposure Is GenAI’s Biggest Risk in the Enterprise


Welcome to the first edition of Oversharing Perspective — where we dive into the realities of deploying Generative AI in the enterprise, one conversation at a time.

To kick things off, I sat down with Mike D’Arezzo, Executive Director of Information Security and GRC at Wellstar Health, to talk about one of the most critical — and often overlooked — challenges in GenAI adoption: accidental data exposure.

Why Oversharing Matters More Than Ever

Mike didn’t hold back:

“My biggest concern is data — how it gets into GenAI tools, who can access it, and whether we can pull it back if it leaks.”

In healthcare, where HIPAA governs every byte of sensitive information, this concern is existential. But Mike made it clear: oversharing isn’t just a healthcare issue — it’s a universal one.

From Use Cases to Guardrails

One of Mike’s strongest messages was about purposeful adoption. Without clear direction, GenAI becomes a high-risk, low-reward tool. To avoid that trap:

  • Start with clear goals and success metrics
  • Involve legal and compliance stakeholders early
  • Build an AI Governance Council to ensure alignment across teams

This isn’t about blocking GenAI — it’s about empowering safe, intentional use.

Real Risks of Oversharing with Copilot

Mike shared some real-world scenarios:

  • Accidentally pasting or uploading PHI into GenAI tools
  • SharePoint folders that default to organization-wide access
  • Users asking Copilot for data they were never meant to see, and getting it, because permissions are broader than anyone realized

“Every Microsoft tool is built for collaboration by default — and that makes oversharing way too easy.”

Oversharing is rarely malicious — but the consequences can be significant.
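
That SharePoint default is also something you can check for directly. Below is a minimal, hypothetical sketch (not Opsin's product or Mike's tooling) that uses the Microsoft Graph API to flag files in a document library whose sharing links are scoped to the whole organization or to anonymous users, the kind of quiet over-permissioning Copilot will happily surface. It assumes you already have a Graph access token with the right read permissions and a drive (document library) ID; paging, folder recursion, and error handling are left out for brevity.

```python
# Minimal sketch: flag files in a SharePoint document library whose sharing
# links are scoped to the whole organization (or anonymous).
# Assumes: a Microsoft Graph access token with Sites.Read.All / Files.Read.All
# and a known drive ID. Paging, folder recursion, and error handling omitted.

import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<access-token>"            # placeholder: acquire via MSAL in practice
DRIVE_ID = "<document-library-id>"  # placeholder drive (document library) ID

headers = {"Authorization": f"Bearer {TOKEN}"}

# List the items at the root of the document library.
items = requests.get(
    f"{GRAPH}/drives/{DRIVE_ID}/root/children", headers=headers
).json().get("value", [])

for item in items:
    # Inspect each item's permissions for broad sharing links.
    perms = requests.get(
        f"{GRAPH}/drives/{DRIVE_ID}/items/{item['id']}/permissions",
        headers=headers,
    ).json().get("value", [])

    for perm in perms:
        scope = perm.get("link", {}).get("scope")
        if scope in ("organization", "anonymous"):
            print(f"OVERSHARED: {item['name']} has a '{scope}' sharing link "
                  f"with roles {perm.get('roles')}")
```

Even a rough audit like this is often the quickest way to see how far collaboration defaults have already spread before Copilot starts indexing them.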

Security Is a Team Sport

For Mike, GenAI security must extend beyond IT. It requires:

  • Education through sessions and internal awareness programs
  • Engagement from both champions and skeptics
  • Policies that adapt with the pace of technology and risk

His advice to other security leaders: involve the people who challenge you. Their pushback can strengthen your safeguards.

What’s Next for GenAI in Healthcare?

Mike sees GenAI as just the beginning. The next frontier? Agentic AI — systems that don’t just generate, but act. He believes we’ll see:

  • AI building processes and systems on demand
  • Natural language interfaces becoming the new operating layer
  • Identity and access control as mission-critical infrastructure

“You and I could ask Copilot the same question and get different answers — because of our roles. That’s how deep identity has to go.”
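
To make that concrete, here is a toy sketch of identity-aware retrieval, with every name and data structure hypothetical: the retrieval layer trims candidate documents to the caller's group membership before the model sees any context, so two users asking the same question genuinely get different answers.

```python
# Toy sketch (all names hypothetical): trim retrieved documents to what the
# asking user is allowed to read *before* the model ever sees them.
# Same question, different answers, because the context differs per role.

from dataclasses import dataclass

@dataclass
class Doc:
    title: str
    content: str
    allowed_groups: set[str]   # groups permitted to read this document

CORPUS = [
    Doc("Q3 staffing plan", "…", {"hr", "executives"}),
    Doc("Patient intake workflow", "…", {"clinical-ops"}),
    Doc("Cafeteria menu", "…", {"everyone"}),
]

def user_groups(user_id: str) -> set[str]:
    # Placeholder: in practice this comes from your identity provider
    # (e.g. directory group membership), not a hard-coded dict.
    directory = {"alice": {"hr", "everyone"}, "bob": {"clinical-ops", "everyone"}}
    return directory.get(user_id, {"everyone"})

def retrieve(question: str, user_id: str) -> list[Doc]:
    groups = user_groups(user_id)
    # Security trimming: only documents the caller can already read are
    # eligible to become model context.
    visible = [d for d in CORPUS if d.allowed_groups & groups]
    # A real system would rank `visible` by relevance to `question` here.
    return visible

if __name__ == "__main__":
    for user in ("alice", "bob"):
        titles = [d.title for d in retrieve("What should I know today?", user)]
        print(user, "->", titles)
```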

Final Thoughts

GenAI is powerful, fast, and full of potential — but without the right guardrails, it can quickly become a liability.

Mike left us with a powerful reminder:

“We need to build the firewalls now — not after we’ve already burned the house down.”

Have thoughts or questions? Drop a comment or subscribe for next week’s conversation with Lisa Choi from Cascade Environmental, where we’ll explore how they’re preparing their workforce for Copilot — without oversharing a byte.


Until then, keep sharing knowledge — not sensitive data.
James

About the Author

James Pham is the Co-Founder and CEO of Opsin, with a background in machine learning, data security, and product development. He previously led ML-driven security products at Abnormal Security and holds an MBA from MIT, where he focused on data analytics and AI.


