
ETL Developer

Brooksource

Location: Charlotte, NC
Salary: Not specified
Type: Full-time
Posted: Today via LinkedIn

Job Description

Job Description: ETL / Data Integration Developer – Data Activation Team

Location: Charlotte, NC (onsite 4 days per week, WFH 1 day)

Duration: Through end of year, with possible extension

Client: Fortune 100 Telecommunications Company

Partner: Brooksource

Overview

Brooksource is seeking a Data Integration Developer to join the Data Activation team within our Fortune 100 Telecommunications client’s Advertising & Media organization. This team builds and maintains the data pipelines that prepare and activate customer audiences, enabling large‑scale marketing and advertising use cases.

This is a hands‑on development role focused on building, optimizing, and troubleshooting data pipelines across AWS, Airflow, and Snowflake. No leadership or management responsibilities are required.

Experience Requirements

3–5 years of professional experience in:

  • SQL development
  • Python scripting
  • ETL/ELT development
  • Airflow or similar orchestration tools
  • Cloud data platforms (Snowflake preferred)
  • MPP databases such as Teradata or Oracle, plus ETL platforms such as Informatica

Key Responsibilities

  • Develop and maintain ETL/ELT workflows supporting audience activation, segmentation, and downstream marketing data.
  • Build and enhance Airflow DAGs that move data from AWS into Snowflake.
  • Write and optimize SQL for transformations and activation‑ready datasets.
  • Support data ingestion and processing within MPP environments such as Teradata and Oracle, and in ETL tools such as Informatica.
  • Perform data validation, reconciliation, and quality checks on activation datasets.
  • Maintain automation scripts and reusable components for CI/CD pipelines.
  • Collaborate with Product Owners, Architects, and other developers within an Agile Scrum team.
  • Troubleshoot and resolve pipeline failures, improve performance, and increase reliability.
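The AWS‑to‑Snowflake movement described above typically stages files in S3 and loads them with Snowflake's COPY INTO command. Below is a minimal sketch of the kind of load step an Airflow task might execute; the table, stage, and path names are hypothetical placeholders, not details from this posting:

```python
# Sketch of an S3 -> Snowflake load step, of the kind an Airflow task might run.
# Table, stage, prefix, and file-format names are hypothetical placeholders.

def build_copy_statement(table: str, stage: str, prefix: str,
                         file_format: str = "csv_fmt") -> str:
    """Render a COPY INTO statement that loads staged S3 files into a table."""
    return (
        f"COPY INTO {table} "
        f"FROM @{stage}/{prefix} "
        f"FILE_FORMAT = (FORMAT_NAME = '{file_format}') "
        f"ON_ERROR = 'ABORT_STATEMENT'"
    )

# Example: render the statement for a (made-up) audience table and daily prefix.
sql = build_copy_statement("analytics.audience_segments",
                           "raw_s3_stage",
                           "audiences/2024-06-01/")
print(sql)
```

In a real DAG this statement would usually run through the Snowflake provider's operator or hook, with scheduling and retries defined on the DAG itself.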

Required Skills

  • Strong experience with SQL
  • Hands‑on Python development
  • Experience developing workflows in Airflow
  • Working knowledge of Snowflake (queries, transformations, modeling)
  • Experience with MPP databases such as Teradata or Oracle, and ETL platforms such as Informatica
  • Strong understanding of ETL/ELT concepts
  • Exposure to AWS or other cloud platforms
  • Familiarity with Unix/Linux commands or scripting
  • Experience with data validation and quality checks
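The validation and reconciliation work listed above often reduces to comparing row counts between a source extract and the loaded target. A small sketch of such a check; the counts and tolerance are made-up stand-ins for values a real job would query:

```python
# Sketch of a row-count reconciliation check between source and target.
# Counts are hard-coded stand-ins for values a real pipeline would query.

def reconcile(source_count: int, target_count: int, tolerance: float = 0.0) -> bool:
    """Return True when the target row count is within `tolerance`
    (expressed as a fraction of the source count) of the source."""
    if source_count == 0:
        return target_count == 0
    return abs(source_count - target_count) / source_count <= tolerance

print(reconcile(1_000_000, 1_000_000))       # exact match -> True
print(reconcile(1_000_000, 999_500, 0.001))  # within 0.1% tolerance -> True
```

In practice a check like this would run after each load, with failures surfaced as a failed task so the pipeline stops before bad data reaches activation.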

Nice-to-Have Skills

  • Experience with marketing/advertising data, audience activation, or CDP workflows
  • AWS tools such as S3, Lambda, Glue, or Step Functions
  • Experience with customer segmentation or media measurement data
