Experienced Hadoop Engineer
About the company
ÅF Pöyry is an international leader in engineering, design and advisory services. We create solutions that help our customers worldwide act on sustainability and on the global trends of urbanisation and digitalisation. We are more than 16,000 devoted experts in infrastructure, industry and energy, operating across the world to create sustainable solutions for the next generation.
About the job
Do you want to be part of a cutting-edge ÅF team working with the latest technology in Big Data and Hadoop full-stack development, on a project with the best developers in the region and the most attractive customer right now? We are looking for developers with a burning passion for complex development, in a team where you will be able to contribute to and learn from highly skilled colleagues. You will be working in an inspiring and challenging environment.
- Responsible for maintaining and scaling production Hadoop, HBase, Kafka, and Spark clusters.
- Responsible for the implementation and ongoing administration of Hadoop infrastructure including monitoring, tuning and troubleshooting.
- Provide hardware architectural guidance, plan and estimate cluster capacity, and create roadmaps for the Hadoop cluster deployment.
- Improve scalability, service reliability, capacity, and performance.
Who are you?
We are looking for a solid, operations-focused Hadoop engineer to administer and scale multi-petabyte Hadoop clusters and their related services. The role focuses primarily on provisioning, ongoing capacity planning, monitoring, and management of the Hadoop platform and the applications and middleware that run on it. You have a tools-first mindset: you build tools for yourself and others to increase efficiency and to make hard or repetitive tasks easy and quick. In addition, you are organised, focused on building, improving, resolving and delivering, and a good communicator within and across teams.
You have a Bachelor's or Master's degree in Computer Science or a similar technical field.
- 10+ years of overall work experience, with at least 5 years of production Hadoop experience on medium to large clusters.
- Hands-on experience managing production clusters (Hadoop, Kafka, Spark, and more).
- Strong development and automation skills. Must be very comfortable reading and writing Python and Java code.
- Experience with GIS and geographical data, and toolkits such as JTS, ArcGIS, QGIS, OpenJUMP, etc.
- Experience with configuration management and automation.
We are looking for someone who wants to be part of ÅF’s success story. Are you passionate about technology development? Do you like working together to find the best solution? Then we can offer you career opportunities in a modern workplace, with challenging assignments and exciting projects all over the world.
The ÅF Group is ranked as one of Sweden’s most popular employers among engineers. At ÅF you will be involved in developing innovative and sustainable solutions within infrastructure, energy and industry. We are always looking for the sharpest minds to help us create the society of the future. We hope you will learn as much from us as we will learn from you.