
Region: Argentina
Start Date: ASAP
Zipdev offers the opportunity to work remotely with clients based in the United States. Zipdev recruits and hires the best Developers, Designers, QA Testers, and Project Managers in Latin America. If you have been successful working remotely, work well with remote teams, and understand the importance of communication, contact us right away.
We are building a world-class data platform that powers decision-making across the business—from product and marketing to finance and executive leadership. As a Senior Data Engineer, you will be a high-impact individual contributor working across a modern data stack, including Snowflake, dbt, and ELT pipelines.
This is a high-autonomy role for someone who cares deeply about data quality, reliability, and scalability. You will design and operate robust data systems, translate business needs into technical solutions, and help establish best-in-class data engineering practices. If you enjoy building fast, trustworthy, and maintainable data platforms at scale, this role is for you.
Responsibilities
Architect and optimize the Snowflake data platform, including warehouse sizing, cost optimization, storage strategy, and access controls
Design and own dbt project structure, including models, macros, testing, documentation, and scalable data contracts
Build and maintain ELT pipelines using Fivetran and orchestration tools, ensuring reliable data ingestion across multiple sources
Implement and manage data quality and observability frameworks (tests, SLAs, lineage, monitoring, incident response)
Translate business requirements into scalable data models and reusable datasets
Partner with Analytics, Product, and Marketing teams to deliver high-quality, self-service data solutions
Establish and enforce data modeling standards (dimensional and ER models)
Optimize query performance and warehouse costs in Snowflake, providing insights to stakeholders
Define and enforce data governance policies, including RBAC, masking, and PII handling
Own end-to-end delivery of complex data initiatives, from design to production
Participate in code reviews and technical design discussions, raising engineering standards
Identify and reduce technical debt across pipelines, models, and infrastructure
Requirements
5+ years of experience in data engineering building and operating production data pipelines
Deep expertise with Snowflake (architecture, cost optimization, governance, RBAC, masking)
Strong experience with dbt (modeling, testing, macros, project structure)
Hands-on experience with Fivetran or similar ELT tools
Strong knowledge of data modeling (dimensional and entity-relationship)
Experience designing and maintaining scalable, production-grade data pipelines
Ability to translate ambiguous business needs into technical solutions
Strong communication skills and experience working cross-functionally
Bachelor’s degree in Computer Science, Data Engineering, or equivalent experience
Tech Stack
Snowflake
dbt
Fivetran
SQL
ELT pipelines
Data modeling (dimensional & ER)
Nice to Have
Experience with orchestration tools (Airflow, Prefect, Dagster)
Familiarity with data observability tools (Monte Carlo, Atlan, dbt Explorer)
Experience with event streaming systems (Kafka, Kinesis)
Background in high-growth or enterprise environments
Advanced degree in a related field
Benefits
Work remotely Monday - Friday, 40 hours a week (no weekends)
Health Care Reimbursement
Active Lifestyle Reimbursement
Quarterly Home Office Reimbursement
Payroll Deduction Purchase Plans
Continuous Learning Bonus
Access to Training and Professional Development Platforms
Did we mention it's REMOTE?!!
One of our core values at Zipdev is "Be authentic," which is why we encourage you to answer the application form in your own words; we are interested in getting to know you, not a digital assistant.
Wondering how our remote environment or our payment method works? We've put together some helpful answers in the FAQs at the bottom of our career site. Take a look and let us know if you have any other questions!