DevOps & Data Engineer

  • Full Time
  • Zambia
  • Applications have closed

Private International Tech Company

The role will be based in Zambia.

We are looking for a highly skilled Data Analyst & Engineer with DevOps expertise to join our team. This hybrid role requires a unique combination of data analysis, engineering, and DevOps skills. The ideal candidate will have experience analyzing complex datasets, designing and maintaining data pipelines, and managing cloud infrastructure in support of data-driven initiatives. This role plays a crucial part in ensuring the efficient and secure flow of data from collection to actionable insights.

Key Responsibilities:

Data Analysis & Insights:
• Analyze large datasets to uncover insights and trends that inform business decisions.
• Build, maintain, and improve data models for various business processes.
• Develop reports, dashboards, and visualizations to present data insights using tools like Power BI, Tableau, MixPanel, Kibana or similar.
• Collaborate with cross-functional teams to understand data needs and provide data-driven solutions.

Data Engineering:
• Design, implement, and maintain scalable data pipelines to ensure efficient data flow across various systems.
• Ensure data quality and consistency by developing robust ETL and DMS processes.
• Work with databases, data warehouses, and data lakes (e.g., AWS Aurora, Snowflake) to store, retrieve, and process data.
• Automate data workflows, data integration, and processing tasks.

DevOps for Data:
• Manage and monitor cloud infrastructure (AWS, Azure, or GCP) for data solutions, ensuring high availability and scalability.
• Build and maintain CI/CD pipelines for data operations, ensuring rapid, error-free deployments.
• Implement automation tools for data pipeline orchestration.
• Ensure security best practices are followed, including data encryption and access management.

Collaboration & Communication:
• Work closely with data scientists, software engineers, product managers, and business stakeholders to align on data requirements and objectives.
• Act as a bridge between the data and DevOps teams, ensuring smooth integration of data solutions into the broader technology infrastructure.

Qualifications:

Education: Bachelor’s or Master’s degree in Computer Science, Data Science, Engineering, Information Technology, or a related field.

Experience:
• Proven experience as a Data Analyst, Data Engineer, or in a similar role with a solid understanding of data architecture and infrastructure.
• Hands-on experience with DevOps tools and practices, especially in cloud environments (AWS, GCP, or Azure).
• Experience working with ETL/ELT pipelines, data warehousing, and big data processing frameworks.
• Strong SQL skills and experience working with relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB).
• Experience with CI/CD pipelines, version control (Git), and infrastructure as code (e.g., Terraform, CloudFormation).
• Familiarity with containerization and orchestration tools like Docker and Kubernetes.

Technical Skills:
• Proficiency with Python, SQL, and data processing frameworks (e.g., Apache Spark, Pandas).
• Familiarity with data visualization tools like Power BI, Tableau, or similar.
• Experience with cloud services such as AWS, GCP (BigQuery, Cloud Storage), or Azure.
• Knowledge of automation tools (e.g., Airflow, Jenkins) and monitoring/logging tools (e.g., Prometheus, Grafana).

Soft Skills:
• Excellent problem-solving skills and attention to detail.
• Strong communication and collaboration skills to work with cross-functional teams.
• Ability to work in a fast-paced, agile environment.

Preferred Qualifications:

• Experience with machine learning pipelines and integrating them into production environments.
• Familiarity with big data technologies such as Kafka, Hadoop, or Spark.
• Knowledge of security best practices in data engineering and cloud-based DevOps environments.
• Certifications in cloud platforms (e.g., AWS Certified Data Analytics or Google Cloud Professional Data Engineer).

Benefits:

• Competitive salary and benefits package.
• Opportunity to work with cutting-edge technologies in data analytics and DevOps.
• Growth opportunities within a collaborative and innovative work environment.
• Flexible working hours and remote work options.
