Public by Default vs HIPAA: Lessons from the Lovable April 2026 Disclosure

Konstantin Kalinin
Apr 30, 2026 • 8 min read

On April 22, Lovable published a post-mortem on a security incident that ran from February 3 to April 20, 2026. During those two and a half months, a backend regression made the chat history and source code of public Lovable projects accessible to any authenticated user with a project link.

The company shipped a fix within two hours of the public report, made all current public projects private, and laid out a series of process changes around its HackerOne triage workflow. Lovable Cloud and private projects were not affected.

That is a serious incident for any user, but for anyone building a healthcare app on a vibe coding platform, it is something more pointed. It is a real-world demonstration of why the public-by-default posture that general-purpose vibe coding tools have leaned on for community and discoverability is structurally incompatible with HIPAA.

This is not a Lovable problem. It is a category problem. Lovable just happens to be the company that disclosed it first, and disclosed it well.

Can I use Lovable to build a HIPAA-compliant healthcare app?

Not currently. Lovable's public legal docs tell users not to upload protected health information under HIPAA, and its April 2026 incident showed that public-by-default project visibility creates the kind of regression risk HIPAA's Security Rule is designed to prevent. For HIPAA-grade AI development, choose a platform built private-by-default with a clear BAA path in place, such as Specode.

Key Takeaways

  1. From February 3 to April 20, 2026, a backend regression at Lovable made the chat history and source code of public projects readable by any authenticated user. Lovable Cloud and private projects were not affected.

  2. The lesson is not Lovable-specific. Public-by-default visibility on any vibe coding platform creates a regression surface that HIPAA's access, audit, and risk analysis controls cannot tolerate.

  3. Healthcare apps need a platform that is private by default and willing to sign a BAA. Careful manual settings on a general-purpose tool are always one mistake away from a breach.

What Was Actually Exposed

The Lovable post is specific about scope: chat history and source code of public projects. To understand why that matters in healthcare, it helps to think about what those two artifacts typically contain when someone is vibe coding a clinical or wellness app.


What's in the Chat History

Chat history in a tool like Lovable, Bolt, Replit, or Base44 is rarely just casual conversation. It is the build log. Inside, you typically find:

  • Schema designs and data models
  • Sample patient records pasted in to help the AI understand the data shape
  • Snippets of de-identified, or sometimes not-so-de-identified, test data
  • Discussions of authentication flows
  • The reasoning behind every architectural choice

What's in the Source Code

Source code, meanwhile, contains:

  • Database structure
  • API integrations with third-party services
  • The implementation of access controls
  • Occasionally, hardcoded secrets that someone meant to clean up later

Why This Matters in Healthcare

Both of those artifacts are exactly what a motivated attacker would want. They are also exactly what a Business Associate Agreement is supposed to wrap legal protection around. If either contained PHI during the exposure window, the user, not Lovable, would be the one responsible for breach notification under 45 CFR 164.400–414. And based on Lovable's public legal docs, there is no clear BAA path to fall back on. In fact, those docs tell users not to upload protected health information under HIPAA in the first place.

Mapping the Incident to the HIPAA Security Rule

Set Lovable aside for a moment and look at the failure mode itself:

  • A backend change re-enabled an access path that had been deliberately closed
  • Valid researcher reports were closed without being escalated, because internal triage documentation was out of date
  • The exposure ran for over two months before being caught

The HIPAA Security Rule is essentially a list of administrative, physical, and technical safeguards designed to prevent exactly this sequence of events. Four sections are particularly relevant here.

Access Controls: 45 CFR 164.312(a)(1)

The access control standard requires technical policies and procedures that allow access to electronic PHI only to authorized persons. "Anyone with a project link who happens to be logged in" is the opposite of that.

Audit Controls: 45 CFR 164.312(b)

Audit controls require mechanisms that record and examine activity in systems that contain or use ePHI. Lovable has said it is reviewing which public projects were viewed by users other than the owners during the exposure window. The fact that this is a forensic exercise after the fact, rather than something the platform was actively monitoring, is exactly the kind of gap the audit control standard exists to close.
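The difference is easier to see in code. Below is a minimal sketch, assuming a hypothetical `AuditLog` class (none of these names are Lovable's real internals): when reads are recorded at access time, "which projects were viewed by users other than the owner" becomes a query, not a forensic reconstruction.

```python
# Hypothetical sketch, not any platform's actual API. The point is that
# access is recorded at read time, so reviewing an exposure window is a
# query over existing records rather than after-the-fact forensics.
import datetime

class AuditLog:
    def __init__(self):
        self.entries = []

    def record(self, actor_id, project_id, action):
        self.entries.append({
            "ts": datetime.datetime.now(datetime.timezone.utc).isoformat(),
            "actor": actor_id,
            "project": project_id,
            "action": action,
        })

    def views_by_non_owners(self, project_id, owner_id):
        # The review Lovable described becomes a one-line filter.
        return [e for e in self.entries
                if e["project"] == project_id
                and e["actor"] != owner_id
                and e["action"] == "read"]

log = AuditLog()
log.record("owner-1", "proj-42", "read")
log.record("stranger-9", "proj-42", "read")
print(len(log.views_by_non_owners("proj-42", "owner-1")))  # 1
```

Real implementations would write to an append-only store and capture far more context, but even this toy version illustrates the standard's intent: the record exists before anyone asks the question.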

Risk Analysis: 45 CFR 164.308(a)(1)(ii)(A)

The risk analysis standard requires an accurate and thorough assessment of the potential risks and vulnerabilities to ePHI. A meaningful risk analysis on a platform that hosts healthcare data would have flagged "public projects" as a risk surface long before a regression reintroduced the issue, because the failure mode is obvious the moment the threat model includes "what if this default flips by accident."

Evaluation: 45 CFR 164.308(a)(8)

The evaluation standard requires periodic technical and non-technical evaluation of how well security policies still meet the requirements of the rule. This is the standard that, in spirit, would have caught the regression. A formal change-management review tied to a HIPAA risk analysis would have flagged a backend change that altered who could see source code and chat history. The change-management muscle has to exist, and it has to fire on every relevant deploy.
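In practice, "fires on every relevant deploy" can be as blunt as a test suite that pins the access-control invariant and gates the release. A hedged sketch follows (`Project` and `can_view` are illustrative stand-ins, not any platform's actual model):

```python
# Illustrative sketch of deploy-gating invariant tests. `Project` and
# `can_view` are hypothetical stand-ins for a platform's real model.
from dataclasses import dataclass

@dataclass
class Project:
    owner_id: str
    visibility: str = "private"  # the invariant: new projects are private

def can_view(project: Project, user_id: str) -> bool:
    return project.visibility == "public" or user_id == project.owner_id

def test_default_visibility_is_private():
    assert Project(owner_id="u1").visibility == "private"

def test_logged_in_stranger_cannot_view_private_project():
    # The exact access path the regression re-opened: an authenticated
    # user who is not the owner reading someone else's project.
    assert not can_view(Project(owner_id="u1"), "u2")

# Run in CI on every deploy; a backend change that flips either
# invariant fails the pipeline instead of shipping.
test_default_visibility_is_private()
test_logged_in_stranger_cannot_view_private_project()
```

Nothing here is sophisticated, which is the point: the evaluation standard is less about clever controls than about making sure checks like these run on a schedule rather than after an incident.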

This Is Not a Takedown of Lovable's Engineering

They caught it, fixed it, and wrote a clear public account of what went wrong. The point is narrower: a platform that is not built around the Security Rule will not consistently catch these failures, because the controls that catch them are not on the critical path.

The HackerOne Piece Is the Bigger Lesson

Buried in the middle of the Lovable post is a detail that deserves more attention than it has gotten. Multiple researchers reported the issue through HackerOne starting on February 22.

All reports were closed without being escalated, because the static documentation Lovable had given its triage partners still described public chat visibility as intended behavior.

A Process Failure, Not a Technical One

The product had moved on, the documentation had not, and the gap between the two meant that the people whose job was to catch this issue were actively told to ignore it.

What HIPAA's Administrative Safeguards Would Have Caught

For a HIPAA-regulated environment, this is the kind of failure that the administrative safeguards in 45 CFR 164.308 are designed to make impossible. The relevant controls are all variations on forcing the question "does our process still match what our system actually does?":

  • Risk analysis
  • Workforce training
  • Information system activity review
  • Periodic evaluation

In a general-purpose vibe coding platform, that question is asked when something goes wrong. In a HIPAA-grade platform, it is asked on a schedule.

Why Public-by-Default Cannot Survive Contact with Healthcare

Lovable's public-projects feature was not a bug. It was a deliberate product choice that made early Lovable feel alive:

  • See what other people were building
  • Remix their projects
  • Learn the platform by reading other people's prompts

That is a great fit for hobbyist projects, marketing sites, and indie SaaS prototypes.

Why It Breaks in Healthcare

It is a terrible fit for healthcare, because the moment a developer decides to test their app with even one piece of realistic data, the public surface area becomes a HIPAA problem. The minimum necessary principle that runs through the Privacy Rule has no equivalent in a platform whose default is community discovery.

Deeper dive: Vibe Coding in Healthcare

Even if a healthcare developer always picks "private" manually, they are one mistake away from a disclosure:

  • A UI change that flips the default
  • A backend regression that re-enables access
  • A teammate who toggles visibility without realizing what it controls

The Category Lesson

This is the underlying reason Lovable's incident is bigger than Lovable. Any platform that treats public visibility as a first-class feature has to defend it against regressions on every deploy, forever. Any platform that does not have public visibility in the product at all has one fewer thing to defend.

The Simpler Version of the Argument

There is also a simpler version of this that does not require any HIPAA case law to land. If a platform's own legal docs tell users not to upload PHI, careful visibility settings are not a workaround. Anything that touches real PHI needs a clear BAA path before it belongs on that platform.

The Lovable disclosure is a useful prompt to ask the question, but the question itself is older: is the platform you are building on willing to be a business associate for this use case, and do its public terms actually allow PHI?

Deeper dive: Is Lovable HIPAA Compliant?

What to Look for If You Are Building Healthcare on AI

If you are evaluating a vibe coding platform for a healthcare app, the Lovable disclosure gives you a useful checklist. Ask the platform team:

  • Is project visibility private by default, with no public option on healthcare-tier accounts?
  • Will the platform sign a BAA, and what is in scope?
  • Are there audit logs for project, chat history, and source code access?
  • Is there a documented change-management process for backend changes that affect access control?
  • Is there an active vulnerability disclosure program, with triage trained on current product behavior?
  • Are encryption at rest and in transit independently attested?
  • Are role-based access controls available for teams?

If the honest answer to most of those is "not yet," the platform is not ready to host anything that will eventually touch PHI, regardless of how good the AI is at generating code.

What to Do This Week If You Have Been Prototyping on Lovable

If you have been building a healthcare-adjacent app on Lovable during the exposure window, work through these three steps as soon as possible:

  • Audit your chat history. Look for PHI, real names, emails, or sample records that resemble real patients. Remove anything that should not have been there.
  • Rotate every credential. API keys, database passwords, and third-party tokens that appear anywhere in your project or build logs.
  • Reassess the platform itself. Decide whether the platform you are on is the right place to ship something that will eventually have a BAA attached to it, or whether the prototype needs to migrate before it sees a real user.
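The first two steps can be partly automated. Here is a rough sketch of a scan over exported project text, with deliberately incomplete patterns: these catch obvious leaks such as emails, SSN-shaped numbers, and common key prefixes, not all PHI, so treat hits as prompts for human review rather than a clean bill of health.

```python
# Rough sketch of the "audit your export" step. The patterns are
# illustrative and deliberately incomplete; a real scan should use a
# dedicated secret scanner and a human review pass on top.
import re

PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "aws_key": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "bearer_token": re.compile(r"(?i)bearer\s+[a-z0-9._-]{20,}"),
}

def scan_text(text):
    """Return a list of (label, match) hits worth a human look."""
    hits = []
    for label, pattern in PATTERNS.items():
        for match in pattern.findall(text):
            hits.append((label, match))
    return hits

sample = "Patient jane.doe@example.com, SSN 123-45-6789, key AKIAABCDEFGHIJKLMNOP"
for label, match in scan_text(sample):
    print(label, match)
```

Run something like this over every exported chat transcript and source file before deciding the window was clean; a negative result narrows the question, while a single hit changes it entirely.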

If Real PHI Was Actually Exposed

If your audit suggests that real PHI was present in chat history or source code during the window, the situation is more serious. Under 45 CFR 164.404 through 164.408, a covered entity must notify, without unreasonable delay and no later than 60 days after discovering a breach:

  • Affected individuals
  • The Secretary of HHS
  • The media, when 500 or more residents of a state or jurisdiction are affected

The clock starts on the date the breach is known or, by exercising reasonable diligence, would have been known. That is a calendar to take seriously. If you are in this situation, the right move is to talk to your privacy counsel this week, not next month.

The Deeper Lesson

Lovable handled the disclosure responsibly, and their post-mortem is worth reading in full as a reference for how to communicate this kind of incident. The deeper lesson is not about them. It is that a vibe coding platform built for general-purpose creativity will always carry assumptions that healthcare cannot live with, and the gap between those assumptions and the Security Rule will eventually be tested by a regression like this one.

Where Specode Fits

Specode is the version of this category that is built the other way around:

  • Private by default
  • BAA available
  • Healthcare-specific access controls
  • A security model that assumes adversarial review from day one

If you are building a clinical, wellness, or digital therapeutics app and you want AI-assisted development without the HIPAA exposure, that is the kind of foundation you want underneath the product.

Frequently asked questions

What was exposed in the Lovable April 2026 incident?

Chat history and source code of public Lovable projects, accessible to any authenticated user with a project link, between February 3 and April 20, 2026.

Were Lovable Cloud and private projects affected?

No. Lovable confirmed that the regression only affected public projects. Cloud workloads and private projects remained inaccessible to other users throughout the exposure window.

Does this trigger HIPAA breach notification?

Only if real PHI was present in the affected project during the window. If so, 45 CFR 164.404 gives you 60 days to notify affected individuals.

Can I make Lovable HIPAA-compliant with careful settings?

Not for PHI use. Lovable's privacy policy, terms of service, and DPA all instruct users not to upload protected health information. Without a clear BAA path and HIPAA-friendly terms, careful visibility settings do not change that.

What should a HIPAA-grade vibe coding platform offer?

Private by default, BAA available, audit logs, documented change management, encryption attestations, role-based access, and an active vulnerability disclosure program with current triage documentation.
