Software engineer (Big data context)
- In-depth programming skills in Python and an eagerness to learn within this new environment are crucial.
- We need someone who is flexible and can operate in a frequently changing environment that is still under construction.
- A short time to market, short release intervals and frequent demos at the end of each sprint require a strong sense of urgency while working towards a future-proof solution.
- They are able to capture feedback frequently and incorporate it into the design of technical solutions.
- Specific knowledge of big data tools and technologies such as Hadoop and Spark, combined with OO development skills. They understand how to apply these tools and technologies to solve big data problems and to develop innovative big data solutions.
Additional skills (general job description of a software engineer working in a big data context):
- A software engineer is a technical role that requires substantial expertise in a broad range of software development and programming fields.
- These professionals have knowledge of data analysis, end-user requirements analysis and business requirements analysis, which they use to develop a clear understanding of the business needs, incorporate those needs into technical solutions and contribute to the solution design.
- They have a solid understanding of physical database design principles and the system development life cycle, as well as a good understanding of the software development lifecycle within a corporate environment: CI/CD pipelines, unit and integration testing and versioning tools.
- Individuals within the software engineer role are responsible for developing prototypes and proofs of concept for the selected solutions and for implementing complex big data projects.
- They must be team players and have a self-starter mentality.
- Analysing, designing, developing, constructing and testing new and existing projects in Python
- Developing and designing a framework with the focus on efficiency (starting from a prototype)
- Incorporating enterprise standards into new software projects (CI/CD pipelines, unit and integration testing and versioning)
- Big Data Framework / Hadoop: Cluster, HDFS, YARN, MapReduce and Hive
- Apache Spark
- Agile: SAFe
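As a minimal illustration of the MapReduce model listed above, the classic word count can be sketched in plain Python; this is a simplified stand-in for Hadoop's distributed map and reduce phases, not an actual Hadoop job, and the sample input is invented:

```python
from collections import defaultdict

def map_phase(lines):
    # Map: emit a (word, 1) pair for every word in every input line.
    for line in lines:
        for word in line.lower().split():
            yield word, 1

def reduce_phase(pairs):
    # Shuffle/reduce: group pairs by key and sum the counts per word.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

# Hypothetical sample input standing in for lines read from HDFS.
lines = ["big data big plans", "data pipelines"]
print(reduce_phase(map_phase(lines)))
# → {'big': 2, 'data': 2, 'plans': 1, 'pipelines': 1}
```

In a real Hadoop or Spark job the map and reduce steps run in parallel across the cluster, with the framework handling the shuffle between them; the logic per record is the same as above.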