FAQ — UDI implementation challenges (EU + CH)

Answers to common UDI questions. Not legal advice. Always verify against official sources.

What this FAQ is (and is not)

  • Confidentiality-aware: We discuss patterns and failure modes without exposing internal methods, client data, or “how-to bypass” tactics.
  • Standards-informed: We align terminology with common regulatory practice (e.g., UDI-DI vs UDI-PI, Basic UDI-DI, issuing entities).
  • Not legal advice: Final interpretation depends on device type, classification, and your notified body / authority expectations.
Why do UDI projects fail even when “the label looks fine”?

Because UDI is not only about the label carrier. Most failures are data governance failures: inconsistent master data, uncontrolled changes, missing evidence, and broken handovers between RA, PLM, ERP, labeling, and suppliers.

  • Symptoms: duplicate identifiers, mismatched device records across systems, late packaging changes, or “last-minute spreadsheet firefighting”.
  • What we do: we reduce ambiguity with controlled inputs, deterministic checks, and audit-friendly outputs.
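
As one concrete example of a deterministic check: when GS1 is the issuing entity, the UDI-DI is a GTIN whose last digit is a mod-10 check digit, so a mistyped or corrupted identifier can be caught mechanically. A minimal sketch (the function name and scope are ours, not a prescribed checkset):

```python
def gtin_check_digit_ok(gtin: str) -> bool:
    """Deterministic check: validate the GS1 mod-10 check digit of a GTIN."""
    if not gtin.isdigit() or len(gtin) not in (8, 12, 13, 14):
        return False  # wrong shape: flag it, do not try to repair it
    *body, check = (int(c) for c in gtin)
    # Weights alternate 3, 1, 3, ... starting at the digit nearest the check digit.
    total = sum(d * (3 if i % 2 == 0 else 1) for i, d in enumerate(reversed(body)))
    return (10 - total % 10) % 10 == check
```
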
What is the practical difference between Basic UDI-DI, UDI-DI and UDI-PI?

Basic UDI-DI groups related devices at a higher level (e.g., a family). UDI-DI identifies a specific device/version/configuration. UDI-PI carries production info (lot/serial, expiry, etc.).

The “gotcha” is change control: certain changes trigger a new UDI-DI, others don’t. Projects fail when teams treat this as a naming problem instead of a controlled decision.
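
One low-tech way to keep the three levels from blurring is to model them as separate record types, so routine production only ever touches the UDI-PI while the UDI-DI stays stable. A sketch with illustrative field names, not a mandated data model:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class UdiDi:
    """One specific device/version/configuration."""
    value: str          # identifier from the issuing entity, e.g. a GTIN
    basic_udi_di: str   # the family-level grouping this device belongs to
    configuration: str  # certain changes here require a NEW UdiDi record

@dataclass(frozen=True)
class UdiPi:
    """Production info for one unit or batch; varies per lot, not per design."""
    udi_di: str
    lot: str | None = None
    serial: str | None = None
    expiry: str | None = None  # ISO 8601 date
```

A new lot yields a new UdiPi record; only a controlled change decision (next question) yields a new UdiDi.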

Which change types typically force a new UDI-DI (or a new Basic UDI-DI)?

It depends on the rule set and issuing entity guidance, but high-risk triggers usually include device identity/variant changes, intended purpose changes, critical design/config changes, and packaging level changes that alter the identifier meaning.

We handle this as a decision record: change → rule reference → outcome → evidence.
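
In code, such a decision record can be a small immutable structure. The field names below are illustrative assumptions, not a prescribed format:

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class ChangeDecision:
    """One controlled decision: change -> rule reference -> outcome -> evidence."""
    change_id: str             # e.g. the PLM/QMS change order it originated from
    description: str           # what changed: variant, intended purpose, packaging, ...
    rule_reference: str        # the rule or issuing-entity guidance clause cited
    outcome: str               # "keep UDI-DI" | "new UDI-DI" | "new Basic UDI-DI"
    evidence: tuple[str, ...]  # document IDs showing how the outcome was reached
    decided_by: str
    decided_on: date
```

Because the record is frozen, revisiting a decision means appending a new record rather than editing the old one, which keeps the audit trail intact.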

Why is “legacy device” handling so painful?

Because legacy portfolios often have gaps: missing historical attributes, inconsistent packaging definitions, and labels that evolved without traceable decisions. Also, “what exists in reality” may not match what exists in ERP/PLM.

  • Typical pitfalls: incomplete packaging hierarchies, ambiguous variants, untracked relabeling, missing IFU/language linkage.
  • Value: a controlled reconciliation plan prevents rework and audit surprises.
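
A reconciliation pass can start as a plain field-by-field diff keyed on UDI-DI that reports disagreements instead of silently picking a winner. A sketch, assuming each system can export its device records as dictionaries keyed by UDI-DI:

```python
def reconcile(plm: dict[str, dict], erp: dict[str, dict]) -> list[str]:
    """Report where PLM and ERP disagree, per UDI-DI; never auto-fix."""
    findings = []
    for udi_di in sorted(plm.keys() | erp.keys()):
        a, b = plm.get(udi_di), erp.get(udi_di)
        if a is None or b is None:
            findings.append(f"{udi_di}: missing in {'PLM' if a is None else 'ERP'}")
            continue
        for field in sorted(a.keys() | b.keys()):
            if a.get(field) != b.get(field):
                findings.append(
                    f"{udi_di}.{field}: PLM={a.get(field)!r} vs ERP={b.get(field)!r}")
    return findings
```
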
What about kits, procedure packs and system packs?

These are frequent sources of confusion because the “device identity” is defined by composition, intended purpose, and packaging logic. The challenge is to keep composition changes traceable without exploding the identifier space.

We focus on clean definitions, controlled composition records, and evidence of what was declared and when.
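
One way to keep composition traceable without multiplying identifiers is to store each declared composition as an immutable, dated snapshot. A sketch; all field names are our illustrative assumptions:

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class CompositionDeclaration:
    """What the pack contained, as declared, at a point in time."""
    pack_udi_di: str
    components: tuple[tuple[str, int], ...]  # (component UDI-DI, quantity)
    declared_on: date
    evidence: str  # reference to the document that backs the declaration
```

A composition change appends a new declaration instead of mutating the old one, so “what was declared and when” can be answered directly from the history.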

Where do most data quality issues hide?

In the seams between systems: PLM↔ERP↔Labeling↔Supplier data, and in free-text fields that become “semi-structured” by accident.

  • Example failure modes: inconsistent units, date formats, language variants, missing UoM, stale packaging levels.
  • Our stance on ambiguity: if a field cannot be trusted, it must be flagged, not “auto-fixed”.
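
As an illustration of that stance: a date like 03/04/2026 could mean 3 April or 4 March, so the only safe deterministic behavior is to flag it for review. A minimal sketch:

```python
import re

ISO_DATE = re.compile(r"\d{4}-\d{2}-\d{2}")

def check_expiry(value: str) -> str | None:
    """Return a finding to be flagged, or None if the value can be trusted as-is."""
    if ISO_DATE.fullmatch(value):
        return None
    # "03/04/2026" is ambiguous (DD/MM vs MM/DD): flag it, never guess.
    return f"expiry {value!r} is not unambiguous ISO 8601 (YYYY-MM-DD)"
```
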
What’s the real work behind EUDAMED / swissdamed readiness?

Portals are the last mile. The real work is creating defensible, consistent device data and controlling the pipeline from source systems to submission.

  • Readiness means: structured data, validation checks, evidence, and repeatable packaging—not a one-off upload.
How do you avoid “portal guesswork” and last-minute surprises?

We pin versions, keep deterministic validation, and treat every submission as a reproducible release: inputs → checks → outputs → manifest.

When portals change, we update the checkset and rerun checks. No manual patching behind the scenes.
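
The phrase “reproducible release” can be made concrete with a manifest that pins the checkset version and a cryptographic hash of every input and output, so anyone can later verify that nothing was patched by hand. A sketch, assuming the release artifacts are files on disk:

```python
import hashlib
import json
from pathlib import Path

def release_manifest(inputs: list[Path], outputs: list[Path],
                     checkset_version: str) -> str:
    """Pin exactly what went into and came out of one submission release."""
    def digest(path: Path) -> str:
        return hashlib.sha256(path.read_bytes()).hexdigest()
    return json.dumps({
        "checkset_version": checkset_version,
        "inputs": {p.name: digest(p) for p in inputs},
        "outputs": {p.name: digest(p) for p in outputs},
    }, indent=2, sort_keys=True)
```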

How do you prove compliance without leaking private details?

By separating structure from sensitive content. We can show artifacts (schemas, manifests, acceptance criteria) openly while product-specific data is shared only privately.

This makes review easier and reduces legal/competitive risk.
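
A small illustration of “structure without content”: derive a shareable view that exposes only field names and types, while the values never leave the private side. A sketch (the record contents shown are dummy values):

```python
def structural_view(record: dict) -> dict:
    """Shareable artifact: field names and types only; the values stay private."""
    return {key: type(value).__name__ for key, value in record.items()}

# structural_view({"udi_di": "0123456789012", "lot": "A1", "qty": 3})
# returns {"udi_di": "str", "lot": "str", "qty": "int"} -- safe to show in a review.
```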

Do you guarantee acceptance by authorities or notified bodies?

No. Anyone who “guarantees” acceptance is selling vibes. What we can guarantee is process quality: controlled inputs, explicit assumptions, deterministic checks, and audit-grade evidence.

What is your anti-spam and resilience approach for forms?

We use Cloudflare Turnstile where allowed. If the security widget is blocked, we provide two conversion-friendly fallbacks: “Email instead” and “Copy message”.

We avoid tracking and invasive scripts.

What’s the fastest path to a stable UDI operating model?

Define ownership and “single source of truth”, freeze a minimal checkset, establish a deterministic validation gate, and ship a small number of pilot devices end-to-end before scaling.

Scaling without a pilot usually means scaling chaos.
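
A “frozen minimal checkset plus deterministic gate” can start as small as the sketch below; the two rules are illustrative placeholders, not a recommended catalogue:

```python
import re
from collections.abc import Callable

# Frozen checkset v1: rule names pinned to pure, deterministic check functions.
CHECKSET_V1: dict[str, Callable[[dict], str | None]] = {
    "udi_di_present": lambda r: None if r.get("udi_di") else "udi_di missing",
    "expiry_iso8601": lambda r: None if re.fullmatch(
        r"\d{4}-\d{2}-\d{2}", r.get("expiry", "")) else "expiry not YYYY-MM-DD",
}

def gate(records: list[dict]) -> list[str]:
    """Deterministic release gate: every finding blocks; an empty list means go."""
    return [f"record {i}, {rule}: {finding}"
            for i, record in enumerate(records)
            for rule, check in CHECKSET_V1.items()
            if (finding := check(record)) is not None]
```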

How can we work together without exposing our IP?

We start with structure: acceptance criteria, a data contract, and a limited-scope pilot plan. Product-specific data and detailed deliverables are handled in a private engagement, within boundaries agreed as needed.

Next step

If you want a concrete implementation plan, start with Start, then check Timelines and References.