1,648 Data Engineer jobs in South Africa
Data Engineer
Posted today
Job Viewed
Job Descriptions
A position has become available for a Data Engineer - SAP BW to join our dynamic team.
The purpose of the Data Engineer is to develop cutting-edge reporting solutions within our data landscape. You’ll gather requirements, design, develop, and test SAP BW objects, and build end-to-end BI solutions. Your expertise in data modelling, HANA integration, and SAP modules will be crucial.
KEY PERFORMANCE AREAS
- Designs, develops, and tests SAP BW objects, including data sources, Advanced DSOs, Composite Providers, transformations, DTPs, and queries, to meet technical specifications, business requirements, and user sign-off.
- Models the SAP BW system with the LSA methodology.
- Develops HANA Calculation Views with integration to SAP BW.
- Designs and builds end-to-end BI solutions that will extract data from multiple systems.
- Develops proof-of-concept prototypes with fast iteration and experimentation.
- Works on projects to integrate data from various internal systems and cloud platforms, including work on all stages of the project through analysis, planning, testing, and release phases.
- Develops and maintains design documentation, test cases, performance and monitoring and performance evaluation.
- Supports existing data warehouse process chains and associated models, queries, CDS Views etc.
- Coordinates, supports, and tests for regular maintenance routines - service pack upgrades, storage migration, site switches etc.
- Works closely with business customers, Business Analysts, and team members to understand the business requirements that drive the analysis and design of quality technical solutions.
- Ad hoc duties as required by management.
MINIMUM REQUIREMENTS
- Bachelor’s Degree in Computer Science, Data & Analytics Management, Supply Chain, Information Services, or related discipline.
- At least 5 - 7 years' experience with SAP Business Warehouse (BW) is required.
- Strong understanding of Business Intelligence and Data Warehousing concepts.
- Extensive knowledge in data modelling, particularly based on the BW/4HANA concept.
- Experience in SAP BW Modelling, Extraction and Report Design, Query Designer and SAP Business Objects tool.
- Knowledge of BW-related ABAP coding (BAdI enhancements; start, end, and expert routines; CMOD exits for custom variables).
- Strong working knowledge of SAP modules including Finance, Supply Chain, Sales & Distribution and Plant Maintenance.
- Experience working with and coordinating multi-business stakeholders and business owners in a retail environment.
IDEAL APPLICANT SKILLS
- A high degree of confidentiality, emotional maturity, ethical values and integrity.
- Must have a high stress tolerance and the ability to work efficiently under pressure.
- Excellent communication, presentation, and interpersonal skills, to interface effectively with all levels of colleagues and with external customers in a team-oriented manner.
- An innovative thinker and ability to solve complex problems.
- Must be collaborative, influential, and rational.
- Precise planning and excellent organisational/administration skills.
- Exceptional accuracy and extreme attention to detail.
- Quality orientated and self-driven to achieve results.
Data Engineer
Imagine being a data and programming guru and understanding the intricacies of banking and insurance. At Monocle, we believe work is more than a job. It is a chance to make a difference in the world of finance. An opportunity to fulfil your potential. To make friendships with ambitious, positive, like-minded people. To gain invaluable industry experience and learn critical skills that will make you a highly sought-after management consultant. Monocle strives to provide our clients with a cradle-to-grave implementation approach. This means that we not only propose fit-for-purpose solutions, but we also execute the development and implementation of the technical infrastructure required to realise the full benefit of these solutions.
Monocle is a people-centric, independent consulting firm, specialising in banking and insurance. We understand that our greatest asset is our people and because of this, we do things a bit differently.
Monocle embraces an ongoing and active approach to amplifying the talents of our people, empowering them to unlock their full potential and feel fulfilled in their careers. Our vibrant culture thrives through our shared growth mindset, collaboration, support, and social connection, fostering a space where everyone can flourish. We are committed to continuous learning and provide ongoing technical training, industry certification support, and personal development opportunities through mentorship and executive engagement. Ultimately, we strive to create an environment that nurtures ambition, supports well-being, embraces diversity, upholds integrity, and inspires individuals to make a positive and lasting impact on the world around us.
Monocle Offers:
- Unparalleled growth and exposure – Monocle is uniquely positioned in the market to undertake projects across a wide spectrum of critical and exciting areas of the financial services industry. Our managers deliver mission critical projects at the most prestigious banks and insurers in Johannesburg, Cape Town, London and Amsterdam.
- Unlimited training and development – Investment in our people’s development is at the heart of Monocle’s company ethos. That is why we prioritise the upskilling of every employee.
- Unique and vibrant company culture – At Monocle, we believe friends work better together than colleagues. We love nothing more than partaking in a wide variety of activities through our company sponsored clubs.
- Ultra-competitive compensation – At Monocle, we want the best talent to join our team, so we understand that those individuals need to be recognised and rewarded for their true value.
Key Responsibilities:
Design and implement scalable data pipelines using cloud services such as Glue, Redshift, S3, Lambda, EMR, Athena, Microsoft Fabric, and Databricks.
Develop and maintain ETL processes to transform and integrate data from various sources.
Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver high-quality data solutions.
Optimize and tune performance of data pipelines and queries.
Ensure data quality and integrity through robust testing and validation processes.
Implement data security and compliance best practices.
Monitor and troubleshoot data pipeline issues and ensure timely resolution.
Stay updated with the latest developments in Data Engineering technologies and best practices.
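As a rough illustration of the extract-transform-load responsibilities above, here is a minimal, self-contained Python sketch. The table name, field names, and sample records are invented for the example; a real pipeline would read from services such as S3 or Glue rather than an in-memory list:

```python
import sqlite3

# Hypothetical source rows, standing in for an extract from an upstream system.
RAW_ORDERS = [
    {"order_id": "A1", "amount": "100.50", "region": "gauteng"},
    {"order_id": "A2", "amount": "75.00", "region": "western cape"},
    {"order_id": "A3", "amount": "not-a-number", "region": "gauteng"},  # bad record
]

def transform(rows):
    """Cast amounts to float, normalise region names, and set aside unparseable records."""
    clean, rejected = [], []
    for row in rows:
        try:
            clean.append((row["order_id"], float(row["amount"]), row["region"].title()))
        except ValueError:
            rejected.append(row)
    return clean, rejected

def load(rows, conn):
    """Load transformed rows into a reporting table, replacing any prior run (idempotent)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id TEXT PRIMARY KEY, amount REAL, region TEXT)"
    )
    conn.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
clean, rejected = transform(RAW_ORDERS)
load(clean, conn)
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone()
```

Routing rejected records to a separate list rather than failing the whole run is one common way to keep a pipeline robust while still surfacing data-quality issues.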
The successful candidate must have comprehensive experience in the above, and must also meet the following requirements:
Holds a Bachelor’s degree from an accredited university.
Industry experience: A minimum of two years of hands-on experience is required. Prior experience in the financial services industry would be beneficial but not mandatory.
Strong foundation in data engineering: We value hands-on experience and proven skills in building and managing data solutions using on-premises technologies or Cloud. While a Bachelor's degree in Computer Science, Engineering, or a related field is a plus, your ability to demonstrate expertise matters most.
Experience with core Cloud Data Services: Familiarity with Glue, Redshift, S3, Lambda, EMR, Athena, Microsoft Fabric or Databricks.
Experience with Big Data technologies: Knowledge of big data technologies such as Apache Spark, Hadoop, or Kafka.
Scripting & Programming proficiency: Programming skills in Python, pandas, and SQL.
Database Management: Experience working with relational databases like AWS RDS, MS SQL, Azure SQL DB or Postgres.
Solid Data Engineering background: Knowledge and experience of data modelling, ETL processes, and data warehousing.
Infrastructure as code (IaC) proficiency: Experience with tools like AWS CloudFormation, Terraform or Azure ARM/Bicep for automating infrastructure provisioning and deployments is crucial.
DevOps fluency: We seek a candidate with experience in CI/CD tools to streamline software development and delivery.
Communication and collaboration: Excellent communication, problem-solving, and analytical skills are key. The ability to present complex technical concepts in a clear and concise way.
Cloud Certification (a plus): While not mandatory, possessing a relevant Cloud certification demonstrates your commitment to professional development and validates your understanding of Cloud services and best practices.
The Ideal Monocle Data Engineer:
Strong problem-solving and analytical skills.
Excellent communication and collaboration skills.
Attention to detail and a commitment to data accuracy.
Eagerness to learn new approaches and technologies.
Works well under pressure to meet client objectives.
Is sociable and enjoys interacting with others, both at work and at social events.
Works well independently and in a team.
Recruitment Process:
During the interview process, you will meet with both People and Culture representatives and Monocle Executives. A critical aspect of the interview process will be determining whether Monocle will be a good fit for you and your career aspirations, so please come prepared with any questions you may have.
If you have a passion for working with people, a relevant qualification, and think you have what it takes to join a team of your smartest and most driven peers, apply to Monocle today!
A 45-minute, 60-question online competency assessment.
P&C Meet and Greet – A personal meeting with People & Culture to get to know you and tell you more about Monocle.
Executive Interview – To discuss your skills, business knowledge, experience, project fit, and the role requirements.
C-level Interview – A discussion with either our CEO, COO, or CFO on various topics.
A final offer is made by the P&C team to the candidate.
Take your career to the next level at a dynamic and innovative management consulting firm.
*Please note that on clicking 'Apply Now' you will be redirected to our third-party career portal.
Data Engineer
Data Engineer
Recruiter:
Datonomy Solutions
Job Ref:
713
Date posted:
Monday, May 12, 2025
Location:
Cape Town, South Africa
Salary:
R60k - 75k per month
POSITION INFO:
We are looking for a Data Engineer to join our team in Cape Town, on an initial 12-month contract, onsite.
The Data Engineer will work within the Business Intelligence team of the information technology division and will be responsible to work closely with our Financial services division for the design, development and implementation of scalable and robust processes, in accordance with specified guidelines.
This role includes extracting, transforming, and consolidating data, developing and maintaining a data platform that will ensure optimal performance and stability.
This position requires strong problem-solving skills, technical expertise in BI tools and SQL, and excellent communication abilities to effectively support users and maintain high service standards.
KEY RESPONSIBILITIES:
- Design, develop, optimize, and maintain data architecture and data pipelines that adhere to ETL principles and business goals.
- Develop complex queries and data solutions using SQL and SSIS according to defined architectural standards.
- Assist in the design and implementation of data pipelines from a variety of data sources and support the maintenance thereof.
- Ingest large, complex data sets that meet functional and non-functional requirements.
- Enable the business to solve the problem of working with large volumes of data in diverse formats, and in doing so, enable innovative solutions.
- Build bulk and delta data patterns for optimal extraction, transformation, and loading of data.
- Engineer data in the appropriate formats for downstream consumption for analytics or Enterprise applications.
- Work with various stakeholders to understand data requirements and apply technical knowledge of data management to solve key business problems.
- Provide support in the operational environment with all relevant support teams for data services.
- Support test and deployment of new services and features.
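The bulk and delta patterns mentioned above can be sketched in plain Python. This is illustrative only: the source rows, watermark column, and function names are assumptions, and a production delta load would persist the watermark between runs (for example, in a control table):

```python
from datetime import datetime

# Hypothetical source table, standing in for any upstream system of record.
SOURCE = [
    {"id": 1, "value": "a", "updated_at": datetime(2025, 1, 1)},
    {"id": 2, "value": "b", "updated_at": datetime(2025, 1, 5)},
    {"id": 3, "value": "c", "updated_at": datetime(2025, 1, 9)},
]

def bulk_load():
    """Bulk pattern: take everything, typically for an initial or full-refresh load."""
    return list(SOURCE)

def delta_load(watermark):
    """Delta pattern: take only rows changed since the last successful run,
    and return the new watermark to persist for the next run."""
    changed = [r for r in SOURCE if r["updated_at"] > watermark]
    new_watermark = max((r["updated_at"] for r in changed), default=watermark)
    return changed, new_watermark

initial = bulk_load()
changed, wm = delta_load(datetime(2025, 1, 4))
```

The trade-off the two patterns capture: a bulk load is simple and self-healing but expensive at volume, while a delta load is cheap per run but depends on a reliable change-tracking column and a durably stored watermark.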
REQUIREMENTS:
- Bachelor’s degree in Computer Science, Business Informatics, or equivalent.
- Minimum of 3 years of relevant data engineering experience, preferably within the retail sector.
- Experience in credit and credit risk will be advantageous.
- Expertise with SQL, SSIS and data analysis and experience with at least one programming language (Python).
- Knowledge of working with DBT would be advantageous.
- Experience developing and maintaining data warehouses in big data solutions.
- Worked with BI tools such as Power BI, Tableau, Pyramid or others.
- Conceptual knowledge of data and analytics, such as dimensional modelling, ETL, ELT, reporting tools, data governance, data warehousing, and structured and unstructured data.
- Knowledge of and experience with cloud technologies, Python, and Databricks will be advantageous.
- Seniority level: Mid-Senior level
- Employment type: Full-time
- Job function: Information Technology
- Industries: Advertising Services
Data Engineer
Location: Gauteng / Western Cape (Work remotely with regular in-person meetings).
Reports to: The appointee will report to the Support Manager.
Main purpose of role
The Data Engineer is responsible for designing, building, and maintaining scalable data pipelines and architectures that support real-time and batch processing of data. This role ensures high data quality, availability, and reliability for internal teams and clients, enabling efficient analytics, reporting, and decision-making across supply chain software implementations.
Key Outcomes
- Build and maintain scalable data pipelines to support client and internal analytics needs.
- Ensure reliable analytics as well as integration between our in-house systems (Warehouse Management Systems, Transport Management Systems, digitisation systems, etc.) and external platforms.
- Collaborate with the Implementation, Support, and Technical teams to deliver data-driven insights.
- Implement data quality, governance, and security best practices.
- Support client reporting and dashboards.
Key Responsibilities include, but are not limited to:
- Design and build data pipelines to extract, transform, and load (ETL/ELT) data from diverse sources including WMS, TMS, ERPs, and APIs.
- Work with SQL/NoSQL databases to manage and query structured and unstructured data.
- Integrate cloud services with on-premises systems to enable hybrid data solutions.
- Develop scripts and automation to support data validation, transformation, and migration tasks during implementations.
- Collaborate with Business Intelligence and Analytics teams to ensure seamless data flow into reporting systems.
- Monitor, debug, and optimise performance of data workflows.
- Maintain data documentation, schemas, and lineage.
- Ensure data compliance with POPIA and other relevant regulations.
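Since several of the responsibilities above involve pulling data from REST APIs, here is a hedged sketch of a paginated extraction loop. The endpoint, page shape, and token field are hypothetical, and the HTTP call is replaced by an in-memory stub so the example stays self-contained:

```python
import json

# In-memory stub standing in for a paginated REST endpoint (e.g. a WMS or TMS API).
# Each "page" is a JSON document with a results array and an optional next-page token.
PAGES = {
    None: json.dumps({"results": [{"sku": "P-1"}, {"sku": "P-2"}], "next": "page2"}),
    "page2": json.dumps({"results": [{"sku": "P-3"}], "next": None}),
}

def fetch_page(token):
    """Stub for an HTTP GET; a real implementation would call the API
    with urllib.request or requests and handle retries and auth."""
    return json.loads(PAGES[token])

def extract_all():
    """Walk the pagination tokens until exhausted, accumulating all records."""
    records, token = [], None
    while True:
        page = fetch_page(token)
        records.extend(page["results"])
        token = page["next"]
        if token is None:
            return records

rows = extract_all()
```

Isolating the transport call in `fetch_page` keeps the pagination logic testable without a live endpoint, which is one reasonable way to structure the validation and migration scripts described above.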
Education
- Bachelor’s or related degree in Computer Science, Information Systems, Engineering, or a related field.
Experience
- 3+ years in data engineering or similar role.
- Experience in the logistics, supply chain, or ERP domain preferred.
Technical Skills
- Strong SQL development and optimization skills.
- Proficiency in Python for data manipulation and integration.
- Experience with HTML, CSS, and JavaScript.
- Experience with ETL tools.
- Familiarity with cloud platforms (Azure, AWS, or GCP) is advantageous.
- Knowledge of data warehousing concepts.
- API integration experience (REST/SOAP).
- Experience with tools like Power BI or Tableau is advantageous.
Soft Skills
- Strong problem-solving and analytical thinking.
- Excellent communication and stakeholder collaboration skills.
- Detail-oriented and highly organized.
- Ability to work independently and within cross-functional teams.
- Determination to master new software & technologies
Working Conditions
- Flexibility for travel between provinces in South Africa as well as cross borders (Africa).
- Must be available to work irregular hours, especially when travelling for client visits.
- Initial and on-the-job training to be provided.
- Competitive salary and benefits package.
- Opportunity to work on a variety of challenging and rewarding projects.
- Collaborative, caring and supportive work environment.
- Salary is based on experience and will be discussed during the interview.
- Gross Package includes a laptop, cell phone and internet router.
- Gross package does not include medical aid & pension fund contributions.
- Travel subsistence is paid when travelling, or the use of a company fleet car is provided.
To Apply:
Please submit your 2–3 page CV and cover letter to
We are an equal-opportunity employer and value diversity at our company.
Data Engineer
Turn raw numbers into stories.
Design pipelines that fuel insights and strategy.
Perfect role for a mid-level Data Engineer with analytical flair.
You’ll collect, clean, and organise data from multiple sources, conduct in-depth analysis, and present your findings in ways that resonate with both technical and non-technical audiences.
Alongside your core analytics work, you’ll also collaborate with diverse teams, support predictive modelling efforts, and build pipelines that ensure reliable, accurate, and scalable data delivery.
This role is ideal for someone with a mix of technical expertise and storytelling ability—someone who doesn’t just crunch data but makes it speak.
Skills & Experience
- 4+ years’ experience in data analysis or engineering
- Strong background in SQL, Python, R, or Excel
- Basic knowledge of visualization platforms such as Power BI
- Familiarity with ETL processes, data pipelines, and cloud platforms (AWS, Azure, GCP, Ab Initio, Teradata) advantageous
- Excellent communication skills, with the ability to simplify technical terms for all audiences
- Strong analytical and problem-solving skills
- Bachelor’s degree in Data Science, Statistics, Mathematics, Computer Science, Economics, or related
Contact DYLAN MAWONA on or , quoting the Ref: CTI
Connect with us on and Register your CV to create a profile or to view all our Jobs in Information Technology.
- Seniority level: Mid-Senior level
- Employment type: Full-time
- Job function: Analyst, Information Technology, and General Business
- Industries: Financial Services, IT System Data Services, Technology, Information and Media
Data Engineer
Data Engineer position available in Johannesburg.
PBT Group is looking for an experienced Data Engineer with deep expertise in Azure Data Factory (ADF) and Databricks to design, develop, and optimise scalable data solutions.
This role is perfect for a proactive problem-solver who thrives in a fast-paced environment and enjoys working at the intersection of data engineering, cloud innovation, and business impact.
What You’ll Do:
- Design, develop, and maintain high-performance data pipelines using ADF and Databricks.
- Transform and integrate data from diverse sources to enable advanced analytics and reporting.
- Optimise workflows for scalability, performance, and cost-effectiveness in Azure.
- Collaborate with analysts, data scientists, and business teams to deliver clean, reliable, and governed data.
- Monitor and improve data quality, security, and compliance across platforms.
- Contribute to DevOps practices, code reviews, and continuous improvement initiatives.
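As one way to picture the data-quality monitoring mentioned above, here is a small, illustrative quality gate in plain Python. The rule names, thresholds, and sample rows are invented; in a Databricks pipeline the same checks would typically run over a Spark DataFrame before a dataset is published downstream:

```python
# Hypothetical rows standing in for a dataset about to be published.
ROWS = [
    {"customer_id": "C1", "balance": 120.0},
    {"customer_id": "C2", "balance": -15.0},
    {"customer_id": None, "balance": 40.0},
]

def quality_gate(rows, min_rows=1):
    """Return (passed, issues): block publishing when key fields are missing
    or the feed falls below a minimum row count."""
    issues = []
    if len(rows) < min_rows:
        issues.append("feed below minimum row count")
    null_keys = sum(1 for r in rows if r["customer_id"] is None)
    if null_keys:
        issues.append(f"{null_keys} row(s) missing customer_id")
    return (not issues), issues

passed, issues = quality_gate(ROWS)
```

Failing fast with an explicit list of issues, rather than silently loading bad rows, is the design choice that makes downstream reporting trustworthy.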
What We’re Looking For:
- 3+ years’ experience as a Data Engineer (or similar).
- Proven expertise in Azure Data Factory and Databricks (Spark, Python, SQL).
- Strong background in ETL/ELT pipelines within cloud-based architectures.
- Proficiency in SQL and experience with both structured and unstructured data.
- Familiarity with Azure Synapse, Data Lake, and version control (Git).
- Solid understanding of data governance and security best practices.
Bonus Skills (Nice to Have):
- Experience with CI/CD pipelines and Infrastructure-as-Code (Terraform, Azure DevOps).
- Exposure to real-time data processing or machine learning pipelines.
Data Engineer
We are seeking a skilled Data Engineer who will be responsible for handling the delivery of data and information relating to business intelligence. They will design, develop, and maintain data warehouses and analytics to meet the enterprise’s business analysis and reporting needs. This is a renewable contract role.
Experience required:
- 6 years of total experience in data engineering using the SQL Server stack (SSIS, SSRS, SSAS) and/or the Azure stack
- 3 years working experience with BI dashboard development e.g. Power BI
- Experience developing, testing for quality assurance, administering RDBMS, and monitoring of databases
- Experience with a Cloud migration of Data Warehouse is highly advantageous
- High proficiency in dimensional modelling techniques and their applications
- Experience in Data Science development in Python or R is advantageous
Qualifications:
- A bachelor’s degree in IT or a related field
- Exposure to working in an Agile methodology
Data Engineer
Nedbank, Johannesburg, Gauteng, South Africa
Overview
Join a collaborative and technically driven team as a Data Engineer, where your skills in SQL and SSIS will be central to maintaining and enhancing critical data operations. This role offers the opportunity to work on a custom-built data system and contribute to innovative projects that introduce new technologies such as Ab Initio, Python, and C#. You’ll be part of a supportive environment that values teamwork, open communication, and continuous learning. With direct client engagement and the chance to influence technical decisions, this position is ideal for professionals who are passionate about solving complex data challenges and driving meaningful improvements.
Job Details: Data Engineer, Nedbank. Location: 135 Rivonia Road, Sandown, Johannesburg, Closing date: 5 August 2025. Requisition nr: .
Job Responsibilities
- Contribute to a culture conducive to the achievement of transformation goals by participating in Nedbank culture-building initiatives (e.g., staff surveys).
- Participate and support corporate social responsibility initiatives for the achievement of key business strategies.
- Identify and recommend opportunities to enhance processes, systems and policies and support implementation of new processes, policies and systems.
- Deliver work according to customer expectations by prioritizing, planning and implementing requirements.
- Utilize resources by adhering to standards, policies and procedures.
- Align and continuously improve set processes by identifying innovation opportunities.
- Identify and mitigate risk by executing within governance.
- Resolve incidents by logging and tracking through correct channels; stay informed on legislation and industry changes affecting the role.
- Understand and embody the Nedbank vision and demonstrate the values in interactions with team and stakeholders.
- Improve personal capability and stay abreast of developments by identifying training opportunities and career progression.
- Ensure personal growth and effectiveness by completing learning activities and maintaining certifications within specified time frames.
- Ensure information is provided correctly to stakeholders by maintaining knowledge sharing within the team.
- Structure data into compliance standards by adhering to metadata governance procedures and Nedbank standards.
- Manage final transformed data content according to prescribed standards for reviewing and publishing.
- Assist/govern population of datamart and metadata repository by complying with standards, systems, processes and procedures.
- Support business units by providing consulting services that deliver data and information relevant to their business.
- Contribute to internal/external information sharing sessions by attending formal and informal meetings.
- Manage vendor relationship interactions in line with vendor management guidelines.
Qualifications and Experience
- Advanced Diploma / National First Degree
- Degree in Information Technology or Business Management, Mathematics/Statistics
- Data Management (DAMA) Certification; Certification or formal training in relevant technology
- 8 years relevant experience, of which 3-5 years in a data management/business role
- Built and maintained stakeholder relationships
- Client and Relationship Results
- Developed and implemented communications strategy
- Improved processes and culture
- Manage internal processes and relationships
- Self-management and transformation support
- Cloud Data Engineering (Azure, AWS, Google)
- Data Warehousing
- Databases (PostgreSQL, MS SQL, IBM DB2, HBase, MongoDB)
- Programming (Python, Java, SQL)
- Data Analysis and Data Modelling
- Data Pipelines and ETL tools (Ab Initio, ADB, ADF, SAS ETL)
- Agile Delivery
- Problem solving skills
- Coaching
- Communication
- Initiating Action
- Managing Work
- Quality Orientation
- Technical/Professional Knowledge and Skills
Preference will be given to candidates from underrepresented groups. Please contact the Nedbank Recruiting Team at .
Data Engineer
Overview
Job Purpose: Design, implement, and maintain the reporting database architecture; Build and optimize ETL pipelines; Develop and maintain semantic models; Support business projects through reporting.
Responsibilities
- Design, implement, and maintain the reporting database architecture
- Build and optimize ETL pipelines
- Develop and maintain semantic models
- Support business projects through reporting
Requirements
- Three years of experience in data engineering or a related field
- MS SQL and SSMS
- NoSQL
- SSIS and ETL pipeline development
- Diploma / Degree in Information Systems, computer science or related field
- Strong understanding of data modelling and data warehouse design
- Ability to create entity relationship diagrams (ERDs)
- Exposure to cloud platforms is an added advantage
- Microsoft 365 suite
- Python
- R (programming language)
- Power BI
- Microsoft Power Automate
- Microsoft Forms
- Microsoft Dataverse
- Working knowledge of other BI technologies
- Project management skills
- Good communication skills
- Analytical skills (Information processing)
- Interpersonal skills
- Good understanding of IT infrastructure
- Good understanding of Microsoft applications and SQL Databases
- Experience with APIs
- Experience with data warehousing and architecture
- Ability to take ownership of a data problem or project from beginning to completion
- Excellent written, communication and presentation skills
- Excellent organizational and follow-up skills
- Competent in problem solving (problem analysis)
- Ability to create, update, and maintain technical documentation with an architectural focus, including ETL processes and ERDs
- Pay attention to detail
- Monitoring
- Record keeping and documentation
- Planned change control knowledge
- Ability to work in a team
- Ability to work under pressure
- Deadline orientated
- Apply company ISO policies relating to data
- Self-starter
- Intrinsic motivation
- Ability to work on their own without constant supervision
- Driven
- Teachable
- Open to feedback and to constructive criticism
- Confident in their area of knowledge
- Ability to make decisions
- Adapts well to change
Data Engineer
Collaborate with data scientists and business stakeholders to design, develop, and maintain efficient data pipelines feeding into the organization's data lake.
Ensure the data lake contains accurate, up-to-date, and high-quality data, enabling data scientists to develop insightful analytics and business stakeholders to make well-informed decisions.
Utilize expertise in data engineering and cloud technologies to contribute to the overall success of the organization by providing the necessary data infrastructure and fostering a data-driven culture.
Demonstrate a strong architectural sense in defining data models, leveraging the PolyBase concept to optimize data storage and access.
Facilitate seamless data integration and management across the organization, ensuring a robust and scalable data architecture.
Take responsibility for defining and designing the data catalogue, effectively modelling all data within the organization, to enable efficient data discovery, access, and management for various stakeholders.
Key Responsibilities:
- Design, develop, optimize, and maintain data architecture and pipelines that adhere to ETL principles and business goals.
- Develop complex queries and solutions using Scala, .NET, Python / PySpark languages.
- Implement and maintain data solutions on Azure Data Factory, Azure Data Lake, and Databricks.
- Create data products for analytics and data scientist team members to improve their productivity.
- Advise, consult, mentor, and coach other data and analytic professionals on data standards and practices.
- Foster a culture of sharing, re-use, design for scale stability, and operational efficiency of data and analytical solutions.
- Lead the evaluation, implementation, and deployment of emerging tools and processes for analytic data engineering in order to improve our productivity as a team.
- Develop and deliver communication and education plans on analytic data engineering capabilities, standards, and processes.
- Partner with business analysts and solutions architects to develop technical architectures for strategic enterprise projects and initiatives.
- Collaborate with other team members and effectively influence, direct, and monitor project work.
- Develop strong understanding of the business and support decision making.
Requirements:
- 10 years of overall experience and at least 5 years of relevant experience.
- 5 years of experience working with Azure data factory & Databricks in a retail environment.
- Bachelor's degree required; Computer Science Engineering.
- 5+ years of experience working in data engineering or architecture role.
- Expertise in SQL and data analysis and experience with at least one programming language (Scala and .NET preferred).
- Experience developing and maintaining data warehouses in big data solutions.
- Experience with Azure Data Lake, Azure Data Factory, and Databricks in the data and analytics space is a must.
- Database development experience using Hadoop or Big Query and experience with a variety of relational, NoSQL, and cloud data lake technologies.
- Worked with BI tools such as Tableau, Power BI, Looker, Shiny.
- Conceptual knowledge of data and analytics, such as dimensional modelling, ETL, reporting tools, data governance, data warehousing, and structured and unstructured data.
- Big Data Development experience using Hive, Impala, Spark, and familiarity with Kafka.
- Exposure to machine learning, data science, computer vision, artificial intelligence, statistics, and/or applied mathematics.
- Fluency in verbal and written English mandatory.
- Fluency in Spanish & French is useful.
- Internal: CEO & COO of Africa, Managers across various departments, Senior Management, Head of Departments in other regional hubs of Puma Energy.