Data Engineer
Job title: Data Engineer in Washington DC at Analytica
Company: Analytica
Job description: Analytica is seeking a remote Data Engineer (MLOps) to support one or more dynamic, long-term federal government enterprise data programs. The ideal candidate will lead the architecture and implementation of on-premises and cloud big data solutions as part of enterprise data modernization.
Analytica has been recognized by Inc. Magazine as one of the 250 fastest-growing businesses in the US for three years. We work with U.S. government clients in health, civilian, and national security missions to build better technology products that impact our day-to-day lives. The company offers competitive compensation with opportunities for bonuses, employer-paid health care, training and development funds, and a 401k match.
Responsibilities include (but are not limited to):
- Build out data pipelines in AWS for data lakes and data warehouses.
- Implement data pipelines in the cloud, feeding an existing data lake and data warehouse, using a variety of tools including Amazon S3, AWS Glue, Amazon Textract, Amazon Comprehend, AWS Lambda, and SQL and/or Python scripts.
- Implement data pipelines for batch and streaming data sources, from external feeds into a cloud-based data lake and eventually into data warehouses.
- Implement data cataloging to share metadata information for datasets in the data lake.
- Use AWS serverless components in the data pipeline architecture (see the illustrative sketch after this list).
- Use IaC tools to deploy the pipelines within AWS.
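To make the serverless pipeline pattern in the list above concrete, here is a minimal, illustrative Python (boto3) sketch, not taken from the posting: an AWS Lambda handler that starts an AWS Glue ETL job each time a new object lands in an S3 landing bucket. The Glue job name and the job argument keys are hypothetical placeholders.

```python
# Illustrative sketch only (not from the posting): an AWS Lambda handler that
# starts an AWS Glue ETL job for each new object created in an S3 landing bucket.
import json
import urllib.parse

import boto3

glue = boto3.client("glue")

GLUE_JOB_NAME = "raw-to-lake-etl"  # hypothetical Glue job name


def handler(event, context):
    """Entry point for an S3 ObjectCreated trigger; launches one Glue run per object."""
    run_ids = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        # S3 event keys are URL-encoded; decode before passing them along.
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        response = glue.start_job_run(
            JobName=GLUE_JOB_NAME,
            Arguments={
                # Hypothetical argument names the Glue script would read.
                "--source_bucket": bucket,
                "--source_key": key,
            },
        )
        run_ids.append(response["JobRunId"])
    return {"statusCode": 200, "body": json.dumps({"job_runs": run_ids})}
```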
Required qualifications:
- 5+ years of hands-on data integration experience creating and maintaining efficient scripts/data pipelines to clean, transform, and ingest data from a variety of formats into database tables, data warehouses, or data lake repositories.
- Experience building data pipelines using AWS serverless components; using AWS Glue to build, maintain, and monitor ETL jobs; using Python to implement ETL scripts; and using AWS Secrets Manager to manage credentials (see the illustrative sketch after this list).
- Experience with AI/ML, NLP, sentiment analysis, etc. with one of the leading cloud providers is a plus.
- AWS ML Certification or AWS Data Engineer Certification is desired.
- Must be a US citizen.
- Must be able to obtain and maintain a Public Trust security clearance.
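As an illustration of the Secrets Manager point above (again, not part of the posting), here is a minimal Python (boto3) sketch of an ETL script fetching database credentials from AWS Secrets Manager instead of hard-coding them. The secret name and its JSON fields are hypothetical.

```python
# Illustrative sketch only (not from the posting): retrieve database credentials
# from AWS Secrets Manager at the start of a Python ETL script.
import json

import boto3

SECRET_NAME = "etl/warehouse-credentials"  # hypothetical secret name


def get_db_credentials(secret_name: str = SECRET_NAME) -> dict:
    """Return the secret's JSON payload (e.g. username/password/host/dbname) as a dict."""
    client = boto3.client("secretsmanager")
    response = client.get_secret_value(SecretId=secret_name)
    return json.loads(response["SecretString"])


if __name__ == "__main__":
    creds = get_db_credentials()
    # A real ETL script would open a database connection here (e.g. via psycopg2
    # or SQLAlchemy) and then run its clean/transform/load steps.
    print(f"Connecting to {creds['host']}/{creds['dbname']} as {creds['username']}")
```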
Expected salary:
Location: Washington DC
Apply for the job now!