The first time a data team wires up Azure Data Factory to Jetty, someone mutters, “Wait, which part handles the auth?” It is a fair question. The integration looks simple on paper, but behind the scenes it juggles pipelines, credentials, and service boundaries that all need to trust each other without leaking secrets.
Azure Data Factory is the workhorse that moves and transforms data across cloud and on‑prem systems. Jetty, in this context, often powers lightweight web services that expose data endpoints or APIs your pipelines depend on. Connecting the two cleanly matters. You want Data Factory orchestrating jobs to your Jetty‑based endpoints with stable identity, strong encryption, and zero guesswork about who can call what.
The pattern is straightforward. Azure Data Factory uses managed identities, or credentials stored in linked services, to authenticate outbound requests. Your Jetty service, running behind something like Azure App Service, verifies those tokens using OpenID Connect or OAuth 2.0. That exchange ensures each data movement or transformation request carries a verifiable identity rather than a static key buried in a configuration file. From there, Jetty handles the request, logs it, and hands back results for downstream actions.
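The verification step on the Jetty side can be sketched in plain Java. This is a minimal illustration of the plumbing only: it extracts the bearer token that Data Factory's managed identity attaches and decodes its payload for inspection. A real deployment must also validate the token's signature, issuer, audience, and expiry with a proper JWT library (for example, Nimbus JOSE+JWT); none of that is shown here.

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class TokenInspection {

    // Pull the raw token out of an Authorization header; null if absent or malformed.
    static String extractBearerToken(String authorizationHeader) {
        String prefix = "Bearer ";
        if (authorizationHeader == null || !authorizationHeader.startsWith(prefix)) {
            return null;
        }
        return authorizationHeader.substring(prefix.length()).trim();
    }

    // Decode the middle (payload) segment of a JWT: base64url-encoded JSON.
    // This exposes claims for inspection only -- it proves nothing by itself.
    static String decodePayload(String jwt) {
        String[] parts = jwt.split("\\.");
        if (parts.length != 3) {
            throw new IllegalArgumentException("not a JWT");
        }
        byte[] json = Base64.getUrlDecoder().decode(parts[1]);
        return new String(json, StandardCharsets.UTF_8);
    }
}
```

In a Jetty handler, these helpers would run before any business logic, with signature validation deciding whether the request proceeds at all.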
If the pipeline fails at that handoff, the usual culprit is role mapping. Check that the Jetty endpoint knows which Azure role or group represents valid callers. You can tighten this further with RBAC policies that align to least privilege. Rotate any secrets you still manage yourself; managed identity credentials are rotated automatically by the platform, but compliance standards like SOC 2 still look for proof that rotation isn't theoretical.
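The least-privilege role mapping described above can be sketched as a simple lookup: each caller identity is granted only the operations its pipelines need. The object IDs and operation names below are placeholders, not real values; in practice the grants would come from configuration or from roles carried in the token itself.

```java
import java.util.Map;
import java.util.Set;

public class RoleMapping {

    // Hypothetical grants: Azure AD object ID of a managed identity -> allowed operations.
    static final Map<String, Set<String>> GRANTS = Map.of(
            "11111111-2222-3333-4444-555555555555", Set.of("read", "trigger"),
            "66666666-7777-8888-9999-000000000000", Set.of("read")
    );

    // True only if this caller is mapped to the requested operation.
    static boolean isAuthorized(String callerObjectId, String operation) {
        return GRANTS.getOrDefault(callerObjectId, Set.of()).contains(operation);
    }
}
```

An unknown identity falls through to an empty grant set and is denied, which is the failure mode to check first when a pipeline call bounces off the endpoint.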
Quick featured answer: You connect Azure Data Factory to Jetty by assigning a managed identity to the factory, exposing your Jetty service with a verified OIDC endpoint, and authorizing that identity through Jetty’s security constraints. This creates a token‑based, auditable connection instead of a manual credential path.
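The "security constraints" in that answer have a standard form in Jetty's servlet configuration. A fragment along these lines, with placeholder path and role name, restricts the pipeline-facing endpoints to authenticated callers over TLS:

```xml
<security-constraint>
  <web-resource-collection>
    <web-resource-name>pipeline-endpoints</web-resource-name>
    <url-pattern>/data/*</url-pattern>
  </web-resource-collection>
  <auth-constraint>
    <!-- Placeholder role; map it to the factory's identity in your realm config -->
    <role-name>datafactory-caller</role-name>
  </auth-constraint>
  <user-data-constraint>
    <!-- CONFIDENTIAL forces the encrypted transport mentioned above -->
    <transport-guarantee>CONFIDENTIAL</transport-guarantee>
  </user-data-constraint>
</security-constraint>
```

How the role name maps to the managed identity depends on the login service or token-validation filter you wire into Jetty; the constraint itself only declares who may reach which paths.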