Data Engineer

EcoMetricx

Location: Remote

Salary: $120,000 – $150,000

Type: Full-time

Posted: Today via LinkedIn


Company Description

EcoMetricx is a data science and analytics firm building modern software, with a focus on government and other regulated industries. We are profitable, debt-free, and growing, and we operate as a small senior team rather than a layered organization. Our clients include municipal governments, utilities, and private-sector firms across a range of industries, and our engagements range from full platform deployments to focused analytical and AI work.

Because the team is small, the work you ship has immediate impact — both on the platform itself and on the agencies, utilities, and businesses that depend on it. If that kind of ownership is what you are looking for in your next role, we would like to hear from you.

Job Description

EcoMetricx is hiring a Data Engineer to build the pipelines, integrations, and cloud infrastructure behind our cutting-edge asset and case management platform. This is a hands-on engineering role on a small, senior team that delivers production software to government agencies, utilities, and private-sector clients across the country.

You will write Python every week, ship code to production on a regular cadence, and engage directly with client IT teams to design integrations and work through technical decisions in real time. There are no layers of account management between you and the work, and no handoff to someone else once a feature ships. The systems you build are the systems you operate.

WHAT YOU'LL WORK ON

Designing and operating Python data pipelines on AWS, using Lambda and Glue as the primary compute and orchestration tools. You will own these pipelines from initial design through production operation, including monitoring, error handling, and the eventual evolution of the pipeline as client needs change.

Building and maintaining integrations with a wide variety of external systems, including enterprise resource planning platforms, productivity suites, payment processors, and government and regulatory APIs. Each engagement brings a different mix of integration targets, so this work rewards engineers who enjoy learning new systems quickly and designing interfaces that hold up under change.

Standing up and operating AWS environments end-to-end. This includes provisioning infrastructure as code, building and maintaining CI/CD pipelines, configuring monitoring and alerting, and operating within SOC 2-aligned security controls — all of which you will own rather than inherit from a separate platform team.

Contributing to our PostgreSQL data model and our FastAPI backend. You will participate in schema design decisions, write API endpoints that other systems depend on, and help us evolve the platform as it scales across additional clients and use cases.

WHAT YOU BRING

At least four years of professional experience building production data pipelines, system integrations, or backend services. We are not looking for a particular résumé shape — backend engineers, data engineers, and integration engineers with the right skill mix are all good candidates.

Strong working proficiency in Python and SQL. You should be comfortable shipping production code without scaffolding, designing schemas against real-world data, and writing queries that perform well at scale.

Hands-on production experience with AWS, including direct experience building and operating workloads on Lambda and Glue. Familiarity with the broader AWS ecosystem (S3, RDS, IAM, VPC, CloudWatch, and similar core services) is expected.

Practical experience designing and consuming REST APIs in production environments, including modern authentication patterns such as OAuth 2.0 and SAML. You should be comfortable reading and writing OpenAPI specifications and reasoning about API contracts.

Day-to-day fluency with infrastructure as code, using Terraform, CloudFormation, or a comparable tool. We expect infrastructure changes to go through code review like any other change.

Comfort with Git, modern CI/CD platforms, and code review as part of your normal workflow. You should be at home in a small team where every change is reviewed and every deploy is automated.

Clear written communication. A meaningful share of the role is producing documentation that clients rely on to operate, audit, and extend the systems we deliver, so writing well matters.

NICE TO HAVE

Production experience with Google Cloud Platform.

Prior delivery experience in the public sector, in regulated utilities, or in another regulated industry where compliance and auditability shape engineering decisions.

Production experience with FastAPI or with Elasticsearch (or comparable search infrastructure) at meaningful scale.

Direct exposure to SOC 2, ISO 27001, FedRAMP, or RMF/ATO compliance frameworks, including familiarity with the engineering controls those frameworks require.

Familiarity with geospatial data and tooling, such as the ESRI ArcGIS REST API or PostGIS, for engagements that involve location-aware data.

DETAILS

This is a remote position based in the United States, with occasional travel to client sites as engagements require.

Full-time, salaried role with a base salary range of $120,000 to $150,000 depending on experience, plus benefits. Specific compensation within that range is based on the candidate's background and is discussed in the initial conversation.

U.S. citizenship is required for this role due to the nature of our public-sector engagements.

We review every application; if you are selected to move forward in the hiring process, you can expect an email from the team.
