Job Description
About The Company
KeyLogic is a leading provider of innovative technology solutions dedicated to supporting mission-critical initiatives across various sectors. With a focus on data-driven decision making, KeyLogic specializes in developing scalable, reliable data engineering infrastructure that empowers organizations to harness the full potential of their data assets. Our commitment to excellence, innovation, and integrity has established us as a trusted partner for government agencies and private sector clients alike. We foster a collaborative environment that encourages continuous learning and professional growth, ensuring our team remains at the forefront of technological advancements.
About The Role
We are seeking a highly skilled Data Engineer to join our remote team and support essential data engineering projects. The ideal candidate will work independently, lead complex technical assignments, and design robust, scalable data solutions. This role involves managing and optimizing data pipelines, maintaining database performance, and modernizing orchestration and governance frameworks. You will work across Oracle and PostgreSQL databases, ensuring data integrity, efficiency, and security, and your expertise will strengthen our cloud-based analytics capabilities and support strategic decision-making. This position offers an exciting opportunity to work on high-impact projects in a dynamic, innovative environment, contributing to the modernization of our data infrastructure and analytics tools.
Qualifications
- Ability to independently manage Oracle and PostgreSQL databases, including schema design, query optimization, and automation of ETL/ELT pipelines
- Experience with performance tuning techniques such as indexing and vacuuming
- Proficiency in implementing data quality checks and managing database backups and restores
- Proven experience developing and maintaining robust data pipelines, including handling schema evolution and backfill strategies
- Strong understanding of data modeling, including star and snowflake schemas, and data lake technologies
- Familiarity with schema change management tools such as Liquibase
- Advanced SQL skills and proficiency in Python programming
- Experience with big data technologies such as Apache Spark and Kafka, and with Airflow for orchestration
- Knowledge of cloud data platforms such as Databricks, infrastructure-as-code tools like Terraform, and container orchestration with Kubernetes
- Experience with data quality checks, testing frameworks such as Pytest, and data lineage management
- Excellent collaboration skills to translate business requirements into technical solutions
- Ability to create and follow low-risk data maintenance processes
- Understanding of how data supports business operations and the ability to work closely with stakeholders
- Experience with AI-assisted development tools is a plus
- Must be able to obtain a Public Trust clearance (U.S. Citizen or Green Card holder)
Responsibilities
- Lead and execute major technical tasks related to data pipeline development, database management, and performance optimization
- Design, develop, and maintain efficient data schemas and pipelines across Oracle and PostgreSQL environments
- Implement and monitor data quality checks, ensuring data accuracy and reliability
- Optimize database performance through indexing, vacuuming, and query tuning
- Automate ETL/ELT processes to support scalable data ingestion and transformation workflows
- Manage schema changes and data migrations using tools like Liquibase
- Collaborate with stakeholders to understand business needs and translate them into technical data solutions
- Implement cloud-based analytics solutions utilizing Databricks, and manage infrastructure with Terraform and Kubernetes
- Use big data tools such as Apache Spark and Kafka for processing, and Airflow for orchestration
- Ensure compliance with data governance standards and maintain data lineage documentation
- Participate in testing, debugging, and troubleshooting data pipelines and database issues
- Develop and document data processes, procedures, and best practices for team and organizational use
- Support data maintenance requests from business units using low-risk, documented procedures
Benefits
- Competitive salary range of $120,000 - $125,000 per year
- Remote work flexibility to support work-life balance
- Opportunities for professional development and continuous learning
- Collaborative and innovative work environment
- Access to cutting-edge technologies and tools in data engineering and cloud computing
- Comprehensive health, dental, and vision insurance options
- Retirement plan options and other financial benefits
Equal Opportunity
KeyLogic is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. We do not discriminate based on race, ethnicity, gender, sexual orientation, age, disability, religion, or any other protected status. All qualified applicants will receive consideration for employment without regard to these factors.