Role Description
You will own end-to-end technical architecture and delivery of data engineering solutions.
- Design and implement scalable ETL architectures and data pipelines.
- Develop and optimize SQL procedures and ETL processes for batch and real-time data processing.
- Lead a team of data engineers in a matrix organization.
- Provide mentoring in ETL best practices and data engineering methodologies.
- Estimate work effort and assist Project Managers with task planning and resource allocation.
- Collaborate directly with clients on technical requirements and solution design.
- Perform data modeling and schema design for efficient data processing.
- Create robust API integrations for real-time data ingestion from various sources.
- Review code and engineering solutions for performance, scalability, and compliance with standards.
- Create comprehensive technical documentation for data pipelines and integrations.
Outcomes
- Deliver sophisticated data integration solutions on time and within budget.
- Build efficient real-time and batch data processing pipelines.
- Design and implement API-based integration solutions.
- Effectively delegate tasks to onshore and offshore development teams.
- Develop and mentor data engineers across the organization.
- Strengthen client relationships through high-quality technical solutions.
- Develop deep expertise in core data engineering platforms and technologies.
Relationships
This position regularly collaborates with cross-functional teams including:
- Quality Assurance
- Campaign Management
- Business Intelligence
- Information Technology
- Project Management
Key Skills / Experience
- Experience with cloud data platforms (AWS, GCP, Azure) and big data technologies.
- 4+ years of experience with ETL tools (Talend, Informatica, SSIS, or DataStage).
- 4+ years of experience with database technologies (SQL Server, Oracle, or other major RDBMS).
- 2+ years of experience leading data engineering teams.
- Advanced knowledge of ETL architecture, processes, and best practices.
- Experience designing and implementing large-scale data warehouses with both relational and dimensional modeling.
- Expert-level SQL and database programming skills.
- Strong experience designing and implementing real-time data ingestion systems using APIs.
- Proven track record building RESTful and streaming API integrations.
- Proficiency with file transfer protocols and security mechanisms (SFTP, PGP encryption).
- Experience with automation, workflow orchestration, and scheduling tools.
- Strong knowledge of software engineering practices including version control (Git, SVN), issue tracking (Jira), and SDLC methodologies.
- Scripting/programming skills in Python, Java, Bash, or Perl.
Preferred Skills
- Experience with cloud data warehouses (Snowflake, BigQuery, Synapse, Redshift, etc.).
- Experience with Customer Data Platforms or Master Data Management systems.
- Expertise with cloud-native ETL services (AWS Glue, Azure Data Factory, GCP Dataflow).
- Experience with streaming data technologies (Kafka, Kinesis).
- Experience with real-time data processing frameworks (Spark Streaming, Flink).
- Experience with data marketing platforms and campaign management systems.
- Knowledge of BI visualization tools (Tableau, Power BI).
- Technical consulting experience.
Requirements
This position is 100% remote and reports to the VP, Data Engineering Lead.
The annual salary range for this position is $94,000 – $152,375. Placement within the salary range is based on a variety of factors, including relevant experience, knowledge, skills, and other factors permitted by law.
Benefits
- Medical, vision, and dental insurance.
- Life insurance.
- Short-term and long-term disability insurance.
- 401(k).
- Flexible paid time off.
- At least 15 paid holidays per year.
- Paid sick and safe leave.
- Paid parental leave.