Job summary

Cape Town, South Africa
Career Level:
Mid Career (2+ years of experience)
Job type:
Full time

Big Data Engineer (Cape Town)(Ref 117)

About this job


As a Big Data Engineer, you will design, develop, deploy and support solutions that leverage the Hadoop big data platform. You will work with Enterprise Architecture, Database and Solution Architects, Business Intelligence Developers and Product Owners to understand business unit requirements and build solutions that meet their needs and objectives. You will be responsible for the design and development of technical solutions on the big data platform.


Bachelor's degree in Computer Science, Engineering or Information Systems.
Postgraduate degree in Computer Science, Engineering or Information Systems.
Master's degree in Computer Science, Engineering or Information Systems. (Ideal)
Formal qualification in Enterprise Architecture (e.g. TOGAF). (Ideal)



5+ years' experience using SQL as a data management language.
3-5 years' experience in data integration and in architecting, developing, implementing and maintaining Big Data solutions using Hadoop (Hortonworks), HDFS, Sqoop, Pig and/or Hive, and HBase.
5-7 years' data mining, data warehousing and business intelligence experience.
3-5 years' experience with Hadoop Big Data tools and technologies such as Hue, HDFS, Python and Java, and other ecosystem solutions such as Impala, Spark, Hive, Kafka, MapReduce, Pig, Sqoop, Flume and Oozie.
3-5 years' experience in Unix/Linux.
3-5 years' experience with automation tools (Control-M).
3-5 years' experience with one of the Hadoop distributions (Hortonworks, HDInsight).
3-5 years' database experience with Microsoft SQL Server, Hive and HBase.
5-7 years' experience with the Microsoft Office Suite (including Visio).
5-7 years' ability to apply the concepts of, and develop artifacts such as, the 7S framework, Organization Charts, Business Domain Models, Process Flows, Activity Diagrams, Data Flow Diagrams, Entity Relationship Diagrams, Product Evaluations, Competitive Comparison Matrices, Mind Maps, Feature Matrices, Roadmaps, Architecture Diagrams and Case Diagrams.
5-7 years' experience developing analytic solutions using R.
5-7 years' deep experience developing enterprise solutions using all aspects of the .NET platform, open source or Java (or any other environment), web services, multithreaded programming, designing and building frameworks, enterprise patterns, SQL design and development, and database tuning.
Strong knowledge of application deployment and of network and security concepts such as the OSI model, AV, anti-malware, cloud, SaaS, IaaS and network protocols.
Experience rolling out, and/or a strong understanding of, Threat Management platforms, Data Loss Prevention platforms, and systems or application security.



Contributes to cost efficiencies
Ensures customer service solutions are aligned to the operational business plan
Ensures continuous process improvement to enable effective operational processes
Conducts research on architectural systems
Contributes to business analysis processes
Provides specialist advice and knowledge sharing
Designs and delivers solutions
Promotes teamwork amongst peers and team members

Job keywords/tags: Hadoop, big data platform