
Python Developer

Building Service 32BJ Benefit Funds

Location

New York, NY

Salary

Not specified

Type

Full-time

Posted

Today

via LinkedIn

Job Description

Job Code

1541

Department Name

IT Development

Reports To

Team Lead, Data Integration

FLSA Status

Exempt

Union Code

N/A

Management

No

About Us:

Building Service 32BJ Benefit Funds (“the Funds”) is the umbrella organization responsible for administering Health, Pension, Retirement Savings, Training, and Legal Services benefits to over 185,000 SEIU 32BJ members. Our mission is to make significant contributions to the lives of our members by providing high quality benefits and services. Through our commitment, we embody five core values: Flexibility, Initiative, Respect, Sustainability, and Teamwork (FIRST). By following our core values, employees are open to different and new ways of doing things, take active steps to improve the organization, create an environment of trust and respect, approach their work with the intent of a positive outcome, and work collaboratively with colleagues.

The Funds oversee and manage $11 billion in assets, spread across many varied and complex funds. These dollars come from a number of sources, including the property owners who pay into the funds on behalf of their employees, and as such require highly skilled financial management professionals to oversee and manage them.

32BJ Benefit Funds will continue to drive innovation, equity, and technology insights to further help the lives of our hard-working members and their families. We use cutting-edge technology such as: M365, Dynamics 365 CRM, Dynamics 365 F&O, Azure, AWS, SQL, Snowflake, QlikView, and more.

Please take a moment to watch our video to learn more about our culture and contributions to our members: youtu.be/hYNdMGLn19A

Job Summary:

Under the supervision of the Team Lead, Data Integration, the Python Developer is responsible for building secure, scalable data pipelines and integrating data from multiple sources. The role involves deep collaboration with business analysts, other developers, and analytics and data science teams. The ideal candidate will have hands-on experience using Python APIs, managing Python environments, and implementing data security practices such as secure configuration and encryption. The role also involves working with Databricks and modern data platforms, and following solid SDLC, documentation, and DevOps practices.
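As a brief illustration of the secure-configuration practice named in the summary: secrets are read from the environment (or a secrets manager such as Azure Key Vault) rather than hardcoded in source. This is a minimal sketch; the variable names below are invented for the example, not taken from the posting.

```python
import os

def load_db_config() -> dict:
    """Read database settings from environment variables so credentials
    never appear in source control. Fails loudly if a required secret
    is missing rather than silently falling back to a default.

    The variable names are hypothetical, for illustration only.
    """
    required = ["SQLSERVER_HOST", "SQLSERVER_USER", "SQLSERVER_PASSWORD"]
    missing = [name for name in required if name not in os.environ]
    if missing:
        raise RuntimeError(f"Missing required secrets: {missing}")
    return {name: os.environ[name] for name in required}
```

In a production pipeline the same lookup would typically be backed by Azure Key Vault, with the environment used only for local development.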

Essential Duties and Responsibilities:

  • Design, develop, and maintain robust Python-based applications and scalable data pipelines
  • Write clean, scalable, and efficient code following best practices
  • Develop and consume REST APIs in Python for data ingestion and integration
  • Configure and manage Python environments (virtual environments, dependency management)
  • Optimize applications for maximum speed and scalability
  • Implement data encryption and security best practices for configuration as well as data in transit and at rest
  • Build and optimize data workflows using Databricks (PySpark)
  • Write, optimize, and maintain T-SQL queries for SQL Server and PostgreSQL
  • Perform ETL/ELT data processing and transformations
  • Support data integration using SSIS and Azure Data Factory
  • Develop well-documented Jupyter/Databricks notebooks and maintain clear technical and process documentation for data pipelines and workflows
  • Follow SDLC best practices throughout development and deployment
  • Use Git / Azure DevOps Git for source code control and CI/CD collaboration
  • Participate in code reviews and troubleshoot data quality or performance issues
  • Perform tasks as required by management/supervisory staff
  • Provide after-hours and weekend support as needed
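One sketch of the REST ingestion duty above: a paginated fetch loop with an injectable fetch function, so the same logic runs against a real HTTP client or a stub in tests. The page shape assumed here ({"items": [...], "next": url-or-None}) is hypothetical; real APIs vary.

```python
from typing import Callable, Iterator

def fetch_all(fetch_page: Callable[[str], dict], first_url: str) -> Iterator[dict]:
    """Follow a paginated REST API until no 'next' link remains.

    fetch_page: callable that GETs a URL and returns the decoded JSON
    body, assumed here to look like {"items": [...], "next": url_or_None}.
    """
    url = first_url
    while url is not None:
        page = fetch_page(url)
        yield from page["items"]  # emit records one at a time
        url = page.get("next")    # None terminates the loop
```

In production, fetch_page would wrap something like requests.get(url, timeout=30).json() with auth headers; in tests it can be as simple as a dict lookup.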

Qualifications (Competencies):

  • 2+ years of experience in Python-based Data Engineering
  • Experience working with RESTful APIs in Python
  • Experience configuring and managing Python environments (venv, conda, pip)
  • Hands-on experience with Azure Key Vault for secrets management
  • Knowledge of data encryption and data security fundamentals
  • Strong SQL skills with SQL Server, PostgreSQL, and T-SQL
  • Experience building ETL/ELT pipelines using Databricks (PySpark)
  • Understanding of SDLC, version control (Git), and CI/CD processes

Preferred (Nice-to-Have) Skills:

  • Experience with SSIS
  • Experience with Azure Data Factory
  • Familiarity with Dremio
  • Exposure to Azure cloud services and AWS
  • Knowledge of data modeling techniques
  • Experience working in Agile/Scrum environments
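A minimal illustration of the transform step in the ETL/ELT pipelines listed above: normalizing raw records before loading. The field names are invented for the example; in Databricks the same logic would be expressed as PySpark DataFrame operations.

```python
from datetime import date

def transform(raw_rows: list[dict]) -> list[dict]:
    """Normalize raw records: trim strings, parse ISO dates, and drop
    rows missing an identifier. Field names are illustrative only."""
    out = []
    for row in raw_rows:
        member_id = (row.get("member_id") or "").strip()
        if not member_id:
            continue  # reject unidentifiable rows instead of loading them
        out.append({
            "member_id": member_id,
            "name": (row.get("name") or "").strip().title(),
            "enrolled_on": date.fromisoformat(row["enrolled_on"]),
        })
    return out
```

Keeping transforms as pure functions over plain records like this makes them easy to unit-test before wiring them into SSIS, Azure Data Factory, or a Databricks job.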

Education:

Bachelor’s degree in Computer Science, or a related discipline.

Reasoning Ability:

High

Physical Demands:

The physical demands described here are representative of those that must be met by an employee to successfully perform the essential functions of this job. Reasonable accommodations may be made to enable individuals to perform the essential functions.

  • Under 1/3 of the time: Standing, Walking, Climbing or Balancing, Stooping, Kneeling, Crouching, or Crawling
  • Over 2/3 of the time: Talking or Hearing
  • 100% of the time: Using Hands

Work Environment:

The work environment characteristics described here are representative of those an employee encounters while performing the essential functions of this job. Reasonable accommodation may be made to enable individuals with disabilities to perform the essential functions.

  • 1/3 to 2/3 of the time: Work near moving or mechanical parts, exposure to radiation, moderate noise.
