### **About Scalepex**
Scalepex is a nearshore partner that empowers businesses to scale efficiently and effectively through exceptional services. With a strong focus on customer success, innovation, and cultural alignment, we support companies in building high-performing teams across industries.
At Scalepex, we believe in fostering an environment where our people can thrive, grow their careers, and make a real impact.
### **Job Summary**
We are looking for a **Data Engineer** to join our DataOps team, focused on delivering high-quality **data pipelines, transformations, and integrations** for healthcare-related data.
This role is highly **hands-on with SQL, data processing, and troubleshooting**, working closely with internal teams and external partners to ensure **data accuracy, scalability, and performance**.
**Requirements**
### **Key Responsibilities**
* Extract, cleanse, and load data.
* Build data pipelines using SQL, Kafka, and other technologies.
* Investigate and troubleshoot issues with data and data pipelines.
* Investigate and document new data sets.
* Implement new data transformation configurations and ad-hoc reports.
* Triage incoming bugs and incidents.
* Participate in sprint refinement, planning, and kick-off to help estimate stories, raise awareness, and define additional implementation details.
* Monitor data pipeline execution and performance, and resolve discovered issues collaboratively.
* Perform and implement data quality checks to maintain consistent and accurate data.
### **Required Qualifications**
* Solid grasp of modern relational and non-relational data models and the differences between them.
* Expert in writing complex SQL, including pivots, window functions, and complex date calculations.
* Expert in analyzing and parsing flat-file data structures.
* Proficient in EDI X12 formats related to healthcare claims processing.
* Proficient in Excel and familiar with analytical tools such as Tableau, MicroStrategy, and Power BI.
* Familiarity with API usage for healthcare data integration, including data parsing and conversion.
* Familiarity with EFT file movement principles and tools, including encryption.
* Ability to translate data movement and transformation concepts and details from existing to new tech stacks.
* Ability to clearly communicate data-related concepts and details with internal colleagues and with external data partners and vendors.
* Detail-oriented and able to examine data and code for quality and accuracy.
* 2+ years of experience in data engineering.
### **Preferred Qualifications**
* Experience with ETL tools and processes.
* Experience with Git.
* Software certification or a degree in IT, Computer Science, or a related field.
**Benefits**
* Contractor scheme
* 100% Remote