About Aggreko plc:
Around the world, people, businesses and countries are striving for a better future. A future that needs power and the right conditions to succeed. That’s why at Aggreko, we work ‘round the clock, making sure our customers get the electricity, heating and cooling they need, whenever they need it – all powered by our trademark passion, unrivalled international experience and local knowledge. From urban development to unique commercial projects and even humanitarian emergencies, we bring our expertise and equipment to any location, from the world’s busiest cities to some of the most remote places on earth. Every project is different, so we listen first and design a system supported by our service anywhere, to any scale. Transforming the lives and livelihoods of individuals, organisations and communities across the globe.
Working within a primarily Azure-based big data environment, the main focus of the role will be to develop, document and implement data pipelines that ingest, enhance and surface data for use by Data Scientists, BI Developers and Applications. You will be expected to enhance and improve current processes and methodologies where relevant, and to contribute to the development of the environment and the technologies used within it.
Working with the data
- Identify data sources and implement methods to ingest this data to the data lake
- Understand and implement data transformation techniques where required, utilising both physical data transformation and techniques such as schema on read (e.g. Hive tables)
- Ensure all ingested, transformed and surfaced data is accurate and trusted
- Monitor and manage any errors or failures within the environment and endeavour to prevent recurrences of these where possible
- Support bug fixing and problem solving as well as enhancing the overall capability of the environment
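The schema-on-read technique mentioned above can be sketched in plain Python. This is a minimal, hypothetical illustration only — in the role this would typically mean Hive external tables or PySpark over the data lake — showing the core idea: raw records land in the lake untouched, and a schema is applied only at the point the data is read.

```python
import json
from datetime import date

# Raw records as they landed in the lake: untyped text, stored as-is.
# (Hypothetical sample data, not an actual Aggreko feed.)
raw_zone = [
    '{"site": "glasgow-01", "kwh": "128.5", "reading_date": "2024-03-01"}',
    '{"site": "dubai-07", "kwh": "301.2", "reading_date": "2024-03-01"}',
]

# The schema lives with the reader, not with the storage: each field
# name maps to a function that parses the raw value into a typed one.
SCHEMA = {
    "site": str,
    "kwh": float,
    "reading_date": date.fromisoformat,
}

def read_with_schema(raw_lines, schema):
    """Apply the schema at read time (schema on read)."""
    for line in raw_lines:
        record = json.loads(line)
        yield {field: cast(record[field]) for field, cast in schema.items()}

readings = list(read_with_schema(raw_zone, SCHEMA))
# readings[0]["kwh"] is now a float, parsed only when the data was read
```

The design choice this illustrates is that the stored data never needs rewriting when the schema evolves — only the reader changes, which is what makes the approach attractive for a data lake.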
Working with the team
- A key requirement is the ability to work collaboratively with multiple teams across ATS, to support development and enhancement of the data lake
- Provide support to internal customers as required, understanding their needs and prioritising appropriately
- Collaborate effectively with BI and Analytics team members, engaging in frequent knowledge sharing, and promoting team working
- Support communications with stakeholders in the business to better understand technical requirements, with a focus on understanding both the data and its availability
- Support communications with stakeholders to present findings, and contribute to identifying new opportunities where Analytics can drive value and performance across the business
- Support project initiatives as defined in the Macro plan, and any other duties as determined by the senior data scientist
Maintaining best practice
- Catalogue all data at each stage of the pipeline (e.g. source, ingested, transformed)
- Ensure all data pipelines are fully and clearly documented and that this documentation is maintained to a high standard
- Maintain best-practice standards in coding, reporting and communication
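The cataloguing requirement above can be sketched as a minimal stage-by-stage record. This is a hypothetical structure for illustration — the names, stages and locations are assumptions, not the team's actual catalogue:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class CatalogueEntry:
    """One entry per dataset per pipeline stage (source, ingested, transformed...)."""
    dataset: str
    stage: str       # e.g. "source", "ingested", "transformed"
    location: str    # where the data lives at this stage
    recorded_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

catalogue: list[CatalogueEntry] = []

def register(dataset: str, stage: str, location: str) -> CatalogueEntry:
    """Append a catalogue record so every pipeline stage is traceable."""
    entry = CatalogueEntry(dataset, stage, location)
    catalogue.append(entry)
    return entry

# Track a hypothetical dataset through two stages of the pipeline
register("site-readings", "source", "sftp://vendor/exports/readings.csv")
register("site-readings", "ingested", "adls://raw/site-readings/2024-03-01/")
```

The point is that each dataset accumulates one entry per stage, so the lineage from source to surfaced data can be reconstructed later.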
Indicative performance measures / Key interdependencies:
A consistent strong performer who can interact with and build relationships with ATS teams and our internal customers. Someone who is comfortable working in a flexible way, adapting to new tools and technologies and promoting innovation. The role may include a small element of international travel.
- Experience of and exposure to enterprise-level Analytics, BI, NoSQL and cloud environments
- Experience of development within a data lake environment; relevant Microsoft certifications are an advantage
- Proven experience within a large commercial IT environment
- Excellent understanding of data lake concepts
- Knowledge and experience using at least 3 of the following technologies in Azure:
- Data Lake Store Gen2
- Logic Apps
- Azure Functions
- Experience with the following is also required:
- Database technologies (SQL Server, MySQL, NoSQL)
- Data languages, including SQL, Python and PySpark
- Exposure to Agile methodologies would be an advantage
Collaboration and communication
- Excellent team player with the ability to collaborate effectively with the team and the business, and to be proactive and work on their own initiative
- Good verbal and written communication and presentation skills, including the ability to explain complex issues to a diverse audience
- Customer focused and open to receiving and giving honest and constructive feedback
The successful individual will be a motivated self-starter who is comfortable working in a cloud-focussed big data environment within a collaborative team. They will be well organised, able to prioritise workload against tight deadlines and work effectively under pressure, with a genuine desire for continuous improvement and innovation, providing best-fit technical solutions to business problems.
We’re the people who use our big boxes to make a massive difference. We believe in the positive impact of power and the ability to control temperature. We believe what we do opens up opportunity and creates potential for individuals, communities, industries and societies all over the world. We believe when we work together we can do anything. We believe in the power of our team. We’re the people who keep the lights on. And we recruit the best talent, too.
Our four values help us get even better at what we do. It’s the Aggreko way of working – we call it Always Orange.
Always Orange means:
Being dynamic: We’re nimble and always ready to react to an ever-changing world.
Being expert: We know our stuff, we’re great under pressure and we thrive in our busy, fast-paced, deadline-driven environment. We use our experience to make a difference. We know how to challenge and we have the courage of our convictions.
Being together: We play for Team Aggreko and value the expertise of everyone around us. We’re accountable and we hold others to account.
Being innovative: We never miss an opportunity to learn, to look out, or to be better.