[Remote] Data Architect (Databricks, AWS)
Note: This is a remote position open to candidates in the USA. Dice is the leading career destination for tech experts at every stage of their careers. Our client, Allwyn Corporation, is seeking a Data Architect to define and design end-to-end data lakehouse architecture leveraging Databricks and AWS, while leading a team of data engineers to build robust data pipelines.
Responsibilities
• Define and design end-to-end data lakehouse architecture leveraging Databricks, Delta Lake, and cloud-native services.
• Create reference architectures for batch, real-time, and streaming data pipelines.
• Architect data ingestion, curation, storage, and governance frameworks.
• Ensure the platform is scalable, secure, and optimized for performance and cost.
• Establish standards for data lineage, metadata management, and compliance.
• Work with enterprise architects to align Databricks solutions with overall cloud strategy (AWS).
• Lead and mentor a team of data engineers in building robust data pipelines.
• Collaborate with data scientists, BI teams, and business stakeholders to enable advanced analytics and AI/ML use cases.
• Drive adoption of DevOps and CI/CD practices for data engineering.
• Review designs and solutions to ensure adherence to architectural principles.
• Build, optimize, and manage large-scale PySpark/SQL pipelines in Databricks (a minimal illustrative sketch follows this list).
• Enable real-time data processing through Kafka, Kinesis, or Event Hubs.
• Implement security best practices including RBAC, data masking, and encryption.
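For context, the sketch below shows roughly what a small batch PySpark/Delta pipeline of the kind referenced in the responsibilities above might look like. The bucket path, table name, and column names are assumptions made purely for illustration and are not part of this posting.

```python
# Minimal sketch of a batch PySpark/Delta curation pipeline (illustrative only).
# Paths, table names, and columns are hypothetical. Databricks notebooks provide
# `spark` automatically; a session is built here so the snippet is self-contained.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("orders-curation")  # hypothetical job name
    .getOrCreate()
)

# Ingest: read raw landing-zone files dropped by an upstream system (assumed path).
raw = spark.read.json("s3://example-bucket/landing/orders/")

# Curate: deduplicate, fix types, mask a sensitive column, and filter bad records.
curated = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .withColumn("card_number", F.sha2(F.col("card_number"), 256))  # simple masking
       .filter(F.col("order_total") >= 0)
)

# Store: append to a Delta table partitioned by date for downstream BI/ML consumers
# (assumes Delta Lake is available, as it is on Databricks).
(
    curated.write
           .format("delta")
           .mode("append")
           .partitionBy("order_date")
           .saveAsTable("analytics.orders_curated")  # hypothetical catalog table
)
```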
Company Overview
• Welcome to Jobs via Dice, the go-to destination for discovering the tech jobs you want. Its website is https://www.dice.com.
Apply to this job