3. Orchestration pipelines and deployment automation framework
You no longer need to be an Apache Airflow expert to harness its power. Orchestration pipelines are a core component of our new cross-product Deployment Automation Framework, allowing you to create end-to-end data pipelines efficiently.
- Declarative orchestration: Define your entire pipeline—including the orchestration logic, infrastructure configuration, and dependencies—in simple, human-readable YAML files.
- Cross-product bundles: These YAML definitions are easily deployed as a complete bundle to the cloud. For example, without knowing Airflow syntax, a user can quickly create and deploy a comprehensive data integration pipeline across dbt, Spark, DTS, and more.
- Unified IDE experience: Alongside automated validation and deployment via GitHub Actions, the Google Data Cloud extension makes agentic authoring and troubleshooting the centerpiece of your workflow. You can now rely on powerful AI agents to build and debug pipelines directly in your IDE, with the ability to visually inspect the agent-generated DAGs for complete oversight.
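To make the declarative model concrete, a pipeline definition might look like the sketch below. The file layout, key names (`tasks`, `type`, `depends_on`), and task types are illustrative assumptions, not the exact schema—consult the product documentation for the real format.

```yaml
# Hypothetical pipeline bundle definition. Key names and task types
# are illustrative only, not the actual schema.
name: daily-sales-pipeline
schedule: "0 6 * * *"        # standard cron syntax

tasks:
  - id: ingest_orders
    type: dts_transfer        # ingest via Data Transfer Service
  - id: transform_orders
    type: dbt_run             # build dbt models on the ingested data
    depends_on: [ingest_orders]
  - id: aggregate_metrics
    type: spark_job           # aggregate with a Spark job
    depends_on: [transform_orders]
```

The point of the declarative form is visible here: dependencies are plain data (`depends_on`), so an analyst can read and edit the DAG shape without writing any Airflow Python.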
Crucially, this declarative approach breaks down the traditional silos between advanced Python developers and data analysts. By shifting to human-readable YAML, we are fostering a more inclusive data culture where a wider range of practitioners can independently author, understand, and manage critical data workflows.
4. MCP Server for Managed Airflow (Public Preview)
To further bridge the gap between AI and orchestration, we are launching the Managed Airflow MCP Server in Public Preview.
- Agentic tooling: This server provides tools like list_environments, get_dag_run, and get_task_instance to fetch critical information about your environments.
- Seamless integration and reduced context-switching: Both humans and agents can use these tools to simplify task management. Most importantly, this drastically reduces the context-switching developers face when debugging complex DAGs. By bringing environment and task data directly into your preferred interfaces, you can troubleshoot faster without constantly pivoting between different consoles.
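Under the hood, an MCP client invokes tools like get_dag_run with a JSON-RPC 2.0 `tools/call` request, as defined by the Model Context Protocol specification. The sketch below builds such a request; the envelope shape follows the MCP spec, but the argument keys (`environment`, `dag_id`, `run_id`) are assumptions about this particular server, not a documented contract.

```python
import json

def mcp_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Serialize an MCP tools/call request (JSON-RPC 2.0 envelope per
    the MCP spec). The tool name comes from the server's tool listing;
    the argument keys used below are illustrative assumptions."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Example: ask the server for a specific DAG run. The environment path,
# DAG ID, and run ID are placeholder values.
payload = mcp_tool_call(1, "get_dag_run", {
    "environment": "projects/my-project/locations/us-central1/environments/prod",
    "dag_id": "daily_sales",
    "run_id": "scheduled__2025-01-01T06:00:00",
})
```

Because the request is plain JSON, the same call works whether it originates from an AI agent in your IDE or from a human-driven client—which is exactly what enables the reduced context-switching described above.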
Embrace the future of data orchestration
With these launches, we are fundamentally lowering the barrier to entry for orchestration while raising the ceiling for what power users can achieve. By removing the infrastructure burden and providing native, agentic tooling, we free data teams to stop wrestling with boilerplate code and focus on deriving insights and driving business value.
Whether you are a seasoned Data Engineer building dynamic Python DAGs or a Data Analyst defining straightforward YAML pipelines, Managed Service for Apache Airflow is built for you.
Get started today
Ready to experience the next generation of data pipeline orchestration? Create a new environment in the Google Cloud Console, explore the Google Cloud Data Agent Kit extension, and start building your agentic future today.
1. Availability might be limited (details)