Data Engineer
One Acre Fund
Job Description
The Data Engineer builds and maintains the systems that power the organization's data platform. This role is responsible for developing reliable data pipelines, maintaining data infrastructure, and enabling high-quality datasets that support analytics and decision-making across the organization.
Data Engineers work closely with analysts, business stakeholders, and other engineers to ensure that data is not only available, but structured and documented in ways that allow it to be used consistently and reliably. They also own one or more data domains, with end-to-end responsibility for the pipelines, transformations, and underlying data models that support those domains.
Responsibilities
Data Pipeline Development
- Design, build, and maintain data pipelines that ingest and process data from operational systems into the data warehouse.
- Ensure pipelines are reliable, scalable, and maintainable.
- Implement monitoring, logging, and alerting to detect failures or anomalies in data processing.
- Troubleshoot and resolve data pipeline failures or data quality issues.
Data Modeling and Semantic Layer Support
- Work with analysts and data stakeholders to transform raw data into structured, reusable datasets that directly support reporting and analytics.
- Implement transformations that capture key business logic within the data warehouse.
- Contribute to the development of curated datasets and standardized views that enable consistent analysis across teams and countries.
- Ensure datasets are clearly documented and understandable to downstream users.
Data Quality and Reliability
- Implement validation, testing, and monitoring processes that ensure data accuracy and consistency.
- Maintain data quality checks and validation rules within pipelines.
- Improve reliability and resilience of the data platform through continuous improvement of infrastructure and processes.
Collaboration with Stakeholders
- Work closely with data analysts, business analysts, and domain experts to understand data needs and operational workflows.
- Translate business requirements into data models and transformations.
- Support the creation of datasets that enable self-service reporting and analytics.
Incident Response and Continuous Improvement
- Investigate and resolve issues affecting data pipelines or datasets.
- Conduct root cause analysis and implement long-term fixes.
- Document systems, datasets, and data pipelines to support knowledge sharing and maintainability.
Qualifications
Technical skills:
- 3+ years of experience in data engineering, analytics engineering, or related software development roles
- Strong experience with SQL and data transformation
- Experience with programming languages such as Python or Java
- Experience building and maintaining ETL/ELT pipelines
- Familiarity with modern data warehouse technologies
Knowledge:
- Understanding of data modeling and warehousing concepts
- Experience working with large datasets and analytical systems
- Familiarity with monitoring, logging, and data quality practices