Job Brief
We are hiring an Associate Data Engineer / Data Engineer to provide critical support for the migration of Talend-based ETL pipelines to a modern data stack built on Python, dbt, and Airflow or Dagster. The role primarily involves assisting with development, testing, monitoring, and documentation tasks during the migration and operational support phases.
VentureDive Overview
Founded in 2012 by veteran technology entrepreneurs from MIT and Stanford, VentureDive is the fastest-growing technology company in the region that develops and invests in products and solutions that simplify and improve the lives of people worldwide. We aspire to create a technology organization and an entrepreneurial ecosystem in the region that is recognized as second to none in the world.
Key Responsibilities:
Migration Support
- Assist in converting and validating Talend ETL workflows into Python scripts, dbt models, and orchestrated DAGs in Airflow or Dagster.
- Follow guidance and standards defined by the lead engineer to ensure consistency and reliability.
Data Pipeline Development
- Develop and test data ingestion scripts and connectors using Kafka, NiFi, and REST APIs.
- Support data transformation logic in dbt, including tests, snapshots, and documentation.
Monitoring & Maintenance
- Monitor existing ETL pipelines (legacy and modern) for failures or performance issues.
- Resolve basic operational issues and escalate complex problems to senior team members.
Data Handling
- Work with data stored in Parquet, Delta Lake, and Iceberg formats across distributed file systems or object storage.
- Perform schema checks, data quality validations, and row-level transformations as needed.
Documentation & Collaboration
- Maintain clear and structured documentation of migration work, process changes, and data mappings.
- Collaborate with QA, data analysts, and platform teams to validate and deploy pipelines in dev/test/prod environments.
Required Qualifications & Experience:
- Bachelor’s degree in Computer Science, Data Engineering, or a related field.
- 1–3 years of hands-on experience in ETL development or data engineering.
- Strong foundation in SQL and Python (data handling, file operations, API usage).
- Familiarity with one or more of the following tools:
  - dbt for transformation
  - Airflow or Dagster for orchestration
  - Kafka, NiFi, or other data ingestion tools
- Understanding of data formats such as CSV, JSON, and Parquet, and of common storage systems.
- Good communication and teamwork skills with an eagerness to learn.
Preferred Qualifications:
- Exposure to modern data platforms (e.g., Databricks, BigQuery, S3, HDFS).
- Prior involvement in migration or refactoring projects is a plus.
- Experience working in Agile/Scrum teams and using Git for version control.
What we look for beyond required skills
In order to thrive at VentureDive, you
…are intellectually smart and curious
…are passionate about and take pride in your work
…deeply believe in VentureDive’s mission, vision, and values
…have a no-frills attitude
…are a collaborative team player
…are ethical and honest
Are you ready to put your ideas into products and solutions that will be used by millions?
You will find VentureDive to be a fast-paced, high-standards, fun, and rewarding place to work. Not only will your work reach millions of users worldwide, you will also be rewarded with competitive salaries and benefits. If you think you have what it takes to be a VenDian, come join us ... we're having a ball!
#LI-Hybrid