Software Engineer - Data and Platform Engineering in Charlotte, North Carolina
About the position
U.S. Bank is seeking a Software Engineer - Data and Platform Engineering in Charlotte, North Carolina. The Software Engineer - Data and Platform Engineering is responsible for designing, implementing, and maintaining the CI/CD pipelines and workflows that support the deployment of cloud-based applications and services for Customer 360 Unix and database objects.
Responsibilities
• Design and develop DAGs (Directed Acyclic Graphs) using Apache Airflow to manage workflow orchestration for Customer 360 business objects.
• Write custom Airflow operators, sensors, and hooks.
• Ensure data quality and implement data validation processes for each unique batch operation.
• Develop batch and real-time schedulers to automate processing of transactional and operational data.
• Develop programs to test, simulate, and design new pieces of automation software using Python or Scala.
• Transition Analytics and Customer 360 applications to a cloud-first approach using Microsoft Azure.
• Apply Dev/Ops mindset and take ownership of Customer 360 production success.
• Develop data storage solutions and integrate databases using Azure data services.
• Design and develop machine learning algorithms and deep learning applications.
• Drive the release planning and execution for successful deployments of Customer 360 operational objects.
• Develop high-quality code and define best engineering practices.
• Document engineering artifacts such as technical design documents and flowcharts.
• Collaborate cross-functionally with product owners, data scientists, and other engineers.
Requirements
• Master's degree in Computer Science or Technology Management plus 3 years of experience as a Software Engineer or Data Engineer.
• Alternatively, a Bachelor's degree in Computer Science or Technology Management plus 5 years of experience as a Software Engineer or Data Engineer is accepted in lieu of a Master's degree plus 3 years of experience.
• 3 years of experience (with a Master's degree) or 5 years of experience (with a Bachelor's degree) building data pipelines, extracting data from source systems, performing data conversion or transformation, and building or developing ETL processes.
• Experience with cloud platforms (Azure or Google Cloud), Python, Spark, Spark SQL, Teradata, Hadoop, HDFS files, .xml files, data models, Oracle, Informatica, Parquet, and JSON.
Benefits
• Healthcare (medical, dental, vision)
• Basic term and optional term life insurance
• Short-term and long-term disability
• Pregnancy disability and parental leave
• 401(k) and employer-funded retirement plan
• Paid vacation (from two to five weeks depending on salary grade and tenure)
• Up to 11 paid holiday opportunities
• Adoption assistance
• Sick and Safe Leave accruals of one hour for every 30 hours worked, up to 80 hours per calendar year.
Apply to this job