Senior Data Architect & Engineer (Azure Databricks Platform)
Company: Corteva, Inc.
Location: Des Moines
Posted on: May 18, 2025
Job Description:
Senior Data Architect & Engineer (Azure Databricks Platform)
Locations: Remote (Iowa); Des Moines, Iowa, United States
Time type: Full time
Requisition ID: 240795W

Who are we, and what do we do?
Corteva Agriscience is the only major agriscience company in the world completely dedicated to agriculture. Our purpose is to enrich the lives of those who produce and those who consume, ensuring progress for generations to come. Our inspiration is to be an innovator, driving the next generation of agriculture products that help farms and farmers flourish, and, through partnering with society, becoming the most trusted partner in the global agriculture and food community.

We are seeking a highly skilled Data Architect & Engineer
to lead the design, development, and implementation of scalable data models and pipelines in Azure Databricks. This hybrid role bridges architecture and engineering and is instrumental in building a high-performance enterprise data lakehouse supporting the commercial, production, and finance domains. The platform will serve as the foundation for data-driven decisions, advanced analytics, and AI model development across the organization.

What You'll Do:

Architecture & Data Modeling
- Design scalable and
maintainable data models across the commercial, production, and finance domains.
- Define and enforce enterprise-wide data architecture standards, naming conventions, and data modeling best practices.
- Collaborate with domain experts, analysts, and business leaders to translate data requirements into logical models.

Engineering & Implementation
- Build and optimize data pipelines in Databricks using PySpark, SQL, Delta Lake, and Delta Live Tables.
- Implement data transformation logic (ELT) to curate clean, trusted, and high-performance data layers.
- Develop data products using Unity Catalog, Alation, Databricks Asset Bundles, and GitLab CI/CD workflows.
- Ensure query optimization, data quality, and high availability of data pipelines.

Platform Management
- Manage and orchestrate workflows using Databricks Workflows, Azure Data Factory, or equivalent tools.
- Integrate structured and unstructured data from diverse sources (e.g., ERP, CRM, IoT, APIs) into the lakehouse.
- Ensure platform security, governance, and compliance using Unity Catalog, RBAC, and lineage tools.

What Skills You Need:
- 5+ years of
experience in data engineering, data architecture, or enterprise analytics roles.
- Strong knowledge of modern enterprise data architectures (including data warehouses, data lakehouses, data fabric, and data mesh), with an understanding of their trade-offs.
- Hands-on experience with Databricks on Azure, including Delta Lake tables and Unity Catalog.
- Proven expertise in data modeling (dimensional) and pipeline development (batch and streaming) for cross-functional enterprise data.
- Proven experience with big data environments, and familiarity with modern data formats (e.g., Parquet, Avro) and open table formats (e.g., Delta Lake, Apache Iceberg).
- Proficiency in SQL, PySpark, dbt, and Kafka for data engineering and transformation workflows.
- Deep understanding of the Azure ecosystem (e.g., ADLS Gen2, Synapse, ADF, Key Vault).
- Experience with version control and CI/CD practices for data projects.

Preferred Qualifications:
- Background in data integration for commercial,
operations, and financial domains.
- Knowledge of data governance, observability, and cataloging tools (e.g., Alation).
- Experience optimizing distributed data processing and cost/performance trade-offs.
- Familiarity with regulatory compliance in enterprise data environments.

Soft Skills:
- Excellent collaboration and communication skills across business and technical teams.
- Comfortable in agile, fast-paced, and highly accountable environments.
- Able to translate complex data problems into practical, scalable solutions.

Emerging Technologies & Practices:
- Apply modern architectural patterns such as Autonomous Data Products to create self-contained, discoverable, and reusable data components with defined ownership and SLAs.
- Drive adoption of data contracts, data observability, and lineage tracing to enhance reliability and governance across data domains.
- Evaluate and implement lakehouse federation patterns and data mesh principles for scaling across global teams and business units.
- Champion the integration of semantic layers, feature stores, and time-travel auditing to support both business intelligence and machine learning use cases.

Benefits - How We'll Support You:
Are you a good match? Apply today! We seek applicants from all backgrounds to ensure we get the best, most creative talent on our team.

The salary range for this position is $126,610.00 to $158,260.00. This reflects a reasonable estimate of the targeted base salary for this role. This role is also eligible for an annual bonus. Based on factors such as geographic location and candidate qualifications, actual base pay is determined when an employment offer is made.

Corteva Agriscience is an equal opportunity employer. We are committed to embracing our differences to enrich lives, advance innovation, and boost company performance. Qualified applicants will be considered without regard to race, color, religion, creed, sex, sexual orientation, gender identity, marital status, national origin, age, military or veteran status, pregnancy-related conditions (including pregnancy, childbirth, or related medical conditions), disability, or any other protected status in accordance with federal, state, or local laws.