Your role
About Us
Atlas Copco is a Swedish multinational industrial company, founded in 1873 and headquartered in Nacka, Sweden. We are world leaders in compressors and also manufacture industrial tools and equipment. The Global IT Hub (India) is an IT service provider within the Atlas Copco Group, focusing on digital transformation technologies for data-driven decisions, web and application development, automation, IT infrastructure, and more. Data Analytics is a competence area within the Global IT Hub (India). As a
Data Engineer
your mission will be to build and maintain enterprise, production-grade data pipelines, considering business requirements and data platform standards. You must have hands-on experience building ETL flows, using CI/CD pipelines, and working with cloud technologies such as Azure Data Lake, ADF, and Databricks, as well as visualisation tools such as Power BI. Additionally, SQL is a must; knowledge of Python is an added advantage. Examples of technologies you will be working with: Azure Data Lake Storage, Delta Lake, Azure Data Factory, Databricks, Azure SQL, Azure DevOps.
You will be part of the Data Analytics Team of the Global IT Hub (India) at Atlas Copco India Private Ltd, supporting the Industrial Tools Business Area. As a Data Engineer within our Data Analytics Competence team, you will lead the management and organisation of data for our custom-built applications and deliver meaningful data-driven insights. You will be instrumental at every stage of the project lifecycle, from extracting, transforming, and loading (ETL) data to managing data pipelines and cloud infrastructure.
Key Responsibilities
- 3–6 years' experience with ETL and ELT.
- Expert knowledge of SQL, with the ability to tune complex SQL queries.
- Mandatory skills: Azure Databricks, SQL, PySpark.
- Should have strong hands-on experience building data pipelines.
- Design and implement effective database solutions and data models to store and retrieve data.
- Provide guidance and advice on data and solutions to stakeholders.
- Hands-on experience with Agile methodology; Jira exposure is nice to have.
- Effective problem-solving and analytical skills; ability to manage multiple projects.
- Experience with any data virtualisation platform.
- Be able to build and maintain ETL/ELT data pipelines independently.
- Be able to assemble large, complex data sets from multiple data sources that meet business requirements.
- Be able to do this whilst being the sole data engineering resource as part of a larger project team.
- Be able to create documentation on the data pipelines and data models developed.
- Design/Build/Test/Deploy/Operate/Support/Change
- The ideal candidate has a degree in Computer Science, Information Technology, or Business Economics, or equivalent experience.
- Good knowledge of BI development principles, time intelligence functions, dimensional modelling, and data visualization is required.
- Mentor and guide junior team members.
- A minimum of 1–2 years' experience in professional BI development and data visualization is preferred.
- You are experienced in data modelling.
- Able to lead small projects in data engineering space.
- You have a good understanding of data warehousing and relational database concepts & technologies.
To succeed, you will need
Competences
- You are a team player and able to motivate people.
- You are customer focused, flexible, and enthusiastic.
- You have good leadership skills.
- You have very strong analytical skills.
- An understanding of manufacturing and engineering processes and flows is seen as an added value.
- You have a strong drive, and you take initiative.
- You are flexible, eager to learn new things and able to adapt in a fast-changing world.
- You are result oriented and quality focused, both in terms of deliveries and processes.
- You work systematically and accurately with a strong focus on user experience.
Stakeholder Management
- You can discuss with stakeholders about their requirements and clearly communicate back your understanding of the requirements.
- You can work with stakeholders to understand the big picture and work towards it.
- Educate users regarding the functionalities provided and train them in using the visualization tool efficiently.
- Work with team members and other users across the organization, at different levels, on performance improvements and suggestions.
Must Have
- Azure Databricks, Azure Data Factory, SQL, Python, PySpark, ETL
- Excellent knowledge (oral and written) of English is required.
Nice to have
- Knowledge of or experience with Microsoft Azure applications
- Experience with a source control system such as Git or SVN
- Knowledge of Power BI and additional Azure services
- Knowledge of Azure Data Lake Storage
- Knowledge of Databricks Unity Catalog
- Knowledge of Azure DevOps
Education
- Any graduate (B.E/B.Tech/MCA/M.Tech/B.Sc/M.Sc/MCS) in Engineering with 3–6 years of experience in the Data Engineering field, projects, and tools.
Knowledge
- You have strong technical knowledge, specifically of data analysis tools, methodologies, and reporting.
- Good analytical skills: you feel comfortable working with numbers and data, e.g. collecting and analysing trend graphs and combining the information with insights from ERP and previous site visits to come to a sound conclusion.
- You are fluent in English: both written and spoken.
- Working knowledge of Agile Scrum ceremonies.
- Knowledge of industry paradigms such as Data Fabric and Data Mesh is a nice to have.
Personality Requirements
- You like open communication and enjoy working with different cultures.
- You are decisive (not afraid of making errors and willing to learn by doing).
- Open-minded: expect the unexpected and stay sceptical when looking at the data and information.
- Not afraid to approach other stakeholders (customer centre specialists, local engineering, or technical support specialists).
- You work with a structured approach when collecting info and data, but also when closing the feedback loop to further increase your learning.
In return, we offer
What We Offer
- Flexible working hours.
- Flexible office and home working policies.
- International travel opportunities.
- A modern infrastructure providing you with the latest tools and technologies at your disposal.
- A challenging environment which contributes to your constant learning and professional growth.
- To be part of the data competence team which is constantly growing.
- Depending on your country, enrolment in group healthcare insurance plans covering your medical needs.
- A chance to become part of a global, innovative company, supporting sustainable productivity.
- You get the opportunity to bring revolutionary ideas fostering innovation and execute qualified ideas.
- A friendly culture with immense professional and personal development, education, and opportunities for career growth.
- Free access to LinkedIn Learning and many other internal and external trainings.
- Atlas Copco offers regular training to help you acquire new skills.
- You get the opportunity to make the world a better place through our sustainable goals and by contributing and being part of our Water for all projects.
- The friendly and open culture of a Swedish company, with very high visibility in the organization and a "no door" culture: you can always talk to anyone in the organization.
- Free canteen and transport facility.
Job location: Hybrid
This role offers a hybrid working arrangement, allowing you to split your time between working remotely and being on-site at our Atlas Copco office in Vadodara, India.
Uniting curious minds
Behind every innovative solution, there are people working together to transform the future. With careers sparked by initiative and lifelong learning, we unite curious minds, and you could be one of them.