**Data Engineer – Data Analytics at blithequark**
Are you a highly skilled and motivated data engineer looking to join a dynamic team at one of the world's leading retailers? Do you have a passion for designing and operationalizing data pipelines to drive business insights and decision-making? If so, we invite you to apply for the Data Engineer – Data Analytics position at blithequark.
**About blithequark**
blithequark is a global leader in the retail industry, with a reputation for providing a family-friendly work environment where employees can thrive and succeed. As a testament to our commitment to our employees, blithequark has been ranked seventh on Forbes' "World's Best Employers" list. Our company culture is built on a foundation of trust, respect, and open communication, and we are dedicated to fostering a collaborative and inclusive work environment.
**The Role**
As a Data Engineer – Data Analytics at blithequark, you will play a critical role in designing and operationalizing data pipelines that drive business insights and decision-making. You will work closely with cross-functional teams, including data architects, data scientists, and business intelligence architects, to design and maintain flexible data models and pipelines. Your programming and SQL expertise will be essential for building and maintaining data pipelines, and your ability to communicate technical ideas to non-technical audiences will be critical to successful project outcomes.
**Key Responsibilities**
* Design and operationalize data pipelines to make data accessible for use (reports and advanced analysis)
* Gather and process large, complex datasets to meet business needs
* Work closely with data architects to align on data engineering requirements
* Develop and maintain optimal data pipeline design
* Identify, plan, and implement process improvements: automating manual processes, streamlining data delivery
* Develop and maintain big data and NoSQL solutions by creating flexible data processing stages that deliver high-value insights to the organization
* Support the development of Data Dictionaries and Data Science categorization for product solutions
* Demonstrate a strong understanding of coding and programming concepts to build data pipelines (e.g., data transformation, data quality, data integration, etc.)
* Work with Data Modelers to assemble data models and develop data pipelines that store data in the defined data models and structures
* Demonstrate a strong understanding of data integration methods and tools (e.g., Extract, Transform, Load (ETL)/Extract, Load, Transform (ELT)) and database design
* Identify ways to improve data reliability, efficiency, and quality of data management
* Lead ad-hoc data retrieval for business reports and dashboards
* Review the integrity of data from various sources
* Manage database environments, including installing and updating software and maintaining supporting documentation
* Monitor database activity and resource utilization
* Perform peer reviews of other Data Engineers' work
* Create and operationalize data pipelines to make data accessible for use (BI, advanced analytics, services)
* Work closely with data designers and data/BI architects to plan data pipelines and recommend continuous improvements to data storage, data ingestion, data quality, and organization
* Plan, develop, and implement ETL/ELT processes using IICS (Informatica Intelligent Cloud Services); a minimal illustrative pipeline sketch follows this list
* Utilize Azure services, such as Azure SQL DW (Synapse), ADLS, Azure Event Hubs, and Azure Data Factory, to improve and accelerate delivery of our data products and services
* Communicate technical ideas to non-technical audiences both in written and verbal form
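By way of illustration only, the sketch below shows the shape of a simple extract–transform–load step in Python. The file name, columns, table name, and connection string are hypothetical placeholders; in practice, pipelines in this role would more likely be built and orchestrated in IICS or Azure Data Factory.

```python
# Minimal ETL sketch: extract a delimited file, apply basic data-quality rules,
# and load the result into a relational staging table. Every path, column,
# table name, and connection string here is a hypothetical placeholder.
import pandas as pd
from sqlalchemy import create_engine

SOURCE_CSV = "daily_sales_extract.csv"       # hypothetical flat-file source
TARGET_TABLE = "stg_daily_sales"             # hypothetical staging table
CONN_STRING = "sqlite:///warehouse_demo.db"  # stand-in for a real warehouse connection

def extract(path: str) -> pd.DataFrame:
    """Read the raw delimited extract into a DataFrame."""
    return pd.read_csv(path, parse_dates=["sale_date"])

def transform(raw: pd.DataFrame) -> pd.DataFrame:
    """Drop incomplete rows, reject negative amounts, and normalize the store key."""
    cleaned = raw.dropna(subset=["store_id", "sale_date", "amount"])
    cleaned = cleaned[cleaned["amount"] >= 0]
    cleaned["store_id"] = cleaned["store_id"].astype(str).str.strip()
    return cleaned

def load(df: pd.DataFrame, table: str, conn: str) -> None:
    """Write the cleaned data to the target table, replacing any prior load."""
    engine = create_engine(conn)
    df.to_sql(table, engine, if_exists="replace", index=False)

if __name__ == "__main__":
    load(transform(extract(SOURCE_CSV)), TARGET_TABLE, CONN_STRING)
```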
**Requirements**
* 3+ years' experience designing and operationalizing data pipelines with large and complex datasets
* 3+ years' hands-on experience with Informatica PowerCenter
* 3+ years' experience in Data Modeling, ETL, and Data Warehousing
* 3+ years' hands-on experience with Informatica IICS
* 3+ years' experience working with cloud technologies, such as ADLS, Azure Data Factory, Azure Databricks, Delta Live Tables, Spark, Azure Synapse, Cosmos DB, and other big data technologies
* Broad experience working with various data sources (SQL, Oracle, flat files (CSV, delimited), web APIs, XML)
* High-level SQL skills: a strong understanding of relational databases and business intelligence, and the ability to write complex SQL queries against various data sources (a brief illustrative query follows this list)
* Strong understanding of data management concepts (data lakes, relational databases, NoSQL, graph databases, data warehousing)
* Ability to work in a fast-paced agile development environment
* Scheduling flexibility to address business needs including weekends, holidays, and on-call responsibilities on a rotational basis
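As a rough illustration of the kind of SQL referenced above, the snippet below runs a windowed ranking query from Python against an in-memory SQLite database; the schema, rows, and column names are invented purely for the example.

```python
# Illustrative "complex SQL" example: a windowed ranking query run from Python.
# The schema and rows are invented solely for demonstration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (store_id TEXT, sale_date TEXT, amount REAL);
    INSERT INTO sales VALUES
        ('S1', '2024-01-01', 120.0),
        ('S1', '2024-01-02',  80.0),
        ('S2', '2024-01-01', 200.0),
        ('S2', '2024-01-02',  40.0);
""")

# Rank each store's days by revenue and keep only its best day.
query = """
    SELECT store_id, sale_date, amount
    FROM (
        SELECT store_id, sale_date, amount,
               RANK() OVER (PARTITION BY store_id ORDER BY amount DESC) AS rnk
        FROM sales
    )
    WHERE rnk = 1;
"""
for row in conn.execute(query):
    print(row)
```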
**Preferred Qualifications**
* BA/BS in Computer Science, Engineering, or equivalent programming/services experience
* Azure Certifications
* Experience executing data integration strategies, such as event/message-based integration (Kafka, Azure Event Hubs) and ETL (see the short sketch after this list)
* Experience with Git/Azure DevOps
* Experience delivering data solutions through agile software development processes
* Familiarity with the retail industry
* Excellent verbal and written communication skills
* Experience working with SAP integration tools, including BODS (SAP BusinessObjects Data Services)
* Experience with UC4 Job Scheduler
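For the event/message-based integration noted above, a minimal producer sketch using the kafka-python client could look like the following. The broker address, topic name, and payload fields are hypothetical placeholders; Azure Event Hubs also exposes a Kafka-compatible endpoint that can be targeted in much the same way.

```python
# Minimal event-based integration sketch using the kafka-python client.
# The broker address, topic name, and payload fields are hypothetical placeholders.
import json
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",  # placeholder broker
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

event = {"store_id": "S1", "event_type": "inventory_update", "quantity": 42}
producer.send("retail-events", value=event)  # placeholder topic
producer.flush()
```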
**What We Offer**
* Competitive salary: $26/hour
* Comprehensive benefits package
* Opportunities for career growth and professional development
* Collaborative and inclusive work environment
* Flexible scheduling to accommodate business needs
* Recognition and rewards for outstanding performance
**How to Apply**
If you are a motivated and experienced data engineer looking to join a dynamic team at blithequark, please submit your application through our website. We look forward to hearing from you!