172 ELT Jobs - Page 5

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

2 - 7 years

2 - 5 Lacs

Hyderabad

Work from Office

Data Engineer

What you will do
Let's do this. Let's change the world. In this vital role you will serve a critical function within Organizational, Planning, Analytics & Insights (OPA&I), with the goal of enterprise-wide, long-term workforce transformation by connecting people, financial, procurement, and capability data to enable business insights and decisions. The Data Engineer will work with the Tech and Data lead for OPA&I and will be responsible for identifying requirements and building data analytics solutions and visualizations for OPA&I.

Roles & Responsibilities:
Understand, connect & integrate data: Develop and maintain robust methods to connect data from various sources, including HR, Finance, Procurement, and Activities. Ensure seamless integration and synchronization between systems and databases. Implement strategies and tools to efficiently manage and process structured and unstructured data. Establish and enforce data validation procedures to ensure data accuracy and consistency.
Deliver data insights: Develop and implement interactive dashboards and visualizations that give collaborators easy access to workforce and financial planning insights. Collaborate with multi-functional teams to understand their data needs and tailor visual solutions accordingly. Apply advanced analytics, models, and GenAI solutions to uncover trends, patterns, and actionable insights. Present data findings in a clear and compelling manner that enhances decision-making.
Integrate business processes with data systems holistically: Develop and deploy solutions that seamlessly support business workflows with data infrastructure. Consistently pursue enhancements to boost efficiency and alignment using data-driven approaches. Partner with business collaborators to advise on comprehensive solutions. Oversee projects aimed at improving data quality and operational performance within business processes. Ensure conformance to data standards and maintain data integrity across all business systems.

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
Master's degree and 1 to 3 years of computer science or quantitative field experience, OR Bachelor's degree and 3 to 5 years, OR Diploma and 7 to 9 years.
2+ years of development experience with Databricks (or Snowflake), including cluster setup, execution, and tuning; experience with other ETL tools like Alteryx is also acceptable.
2+ years of experience building ETL or ELT pipelines; hands-on experience with SQL/NoSQL.
Experience with one or more programming languages: Python, R, SAS, Scala, or Java.
Experience with common data processing libraries: Pandas, PySpark, SQLAlchemy.
Experience building dashboards and end-user interfaces using visualization tools such as Power BI, Tableau, and Spotfire.
Experience working with collaborators to define requirements and design data solutions.
Experience building data analytics and business intelligence solutions with financial and workforce data.
Experience performing exploratory and targeted data analyses using descriptive statistics and other methods.

Preferred Qualifications:
Experience with NLP or GenAI tools like OpenAI.
Experience with software engineering best practices, including but not limited to version control, infrastructure-as-code, CI/CD, and automated testing.
Conceptual understanding of DevOps tools (Ansible, GitLab CI/CD, GitHub, Docker, Jenkins).
Experience working in Agile-based teams.
Experience with R/Python-based visualization frameworks such as Shiny, Dash, Streamlit.

Professional Certifications:
Databricks (preferred)
Tableau/Power BI/Alteryx (preferred)

Soft Skills:
Excellent analytical and troubleshooting skills
Strong verbal and written communication skills
Ability to work effectively with global, virtual teams
High degree of initiative and self-motivation
Ability to manage multiple priorities successfully
Team-oriented, with a focus on achieving team goals
Strong presentation and public speaking skills

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

Apply now for a career that defies imagination. Objects in your future are closer than they appear. Join us. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
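
For illustration only, a minimal sketch of the kind of ETL/ELT step this posting describes, using Pandas and SQLAlchemy; the file names, table names, and columns are invented for the example, not details from the posting:

```python
# Minimal ELT sketch: pull HR and finance extracts, validate, join,
# and load a reporting table. All names here are illustrative only.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("sqlite:///opa_insights.db")  # stand-in for a real warehouse

hr = pd.read_csv("hr_headcount.csv")    # assumed columns: dept, fte
fin = pd.read_csv("finance_spend.csv")  # assumed columns: dept, spend

# A simple data-quality gate before loading.
assert hr["fte"].ge(0).all(), "negative FTE values found"

merged = hr.merge(fin, on="dept", how="left")
merged["cost_per_fte"] = merged["spend"] / merged["fte"]

merged.to_sql("workforce_costs", engine, if_exists="replace", index=False)
```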

Posted 2 months ago

Apply

7 - 9 years

5 - 15 Lacs

Bengaluru

Work from Office

Immediate hiring for Data Analyst - Harita Techserv @ Bangalore location.
Experience: 7 - 9 years
Notice period: Immediate to 30 days
Job Description:
* 7 to 9 years of experience in cloud data and analytics platforms such as AWS, Azure, or GCP
* 3+ years of experience with Azure cloud analytical tools is a must
* 5+ years of experience working with data & analytics concepts such as SQL, ETL, ELT, reporting and report building, data visualization, data lineage, and data importing
* Strong coding skills in languages such as SQL, Python, PySpark
* Experience in data streaming technologies such as Kafka or Azure Event Hubs (see the sketch below)
* Experience in software development on a team using Agile methodology
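
As a purely illustrative sketch of the streaming skill named above, a minimal Spark Structured Streaming read from Kafka might look like this; the broker address, topic, and the availability of the Spark-Kafka connector package are assumptions:

```python
# Reading a Kafka topic with Spark Structured Streaming and echoing
# parsed records to the console. Broker and topic names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "clickstream")
          .load())

# Kafka delivers bytes; cast key and payload to strings for downstream parsing.
parsed = events.select(col("key").cast("string"), col("value").cast("string"))

query = (parsed.writeStream
         .format("console")
         .outputMode("append")
         .start())
query.awaitTermination()
```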

Posted 2 months ago

Apply

5 - 7 years

8 - 10 Lacs

Noida

Work from Office

What you need
BS in an Engineering or Science discipline, or equivalent experience
5+ years of software/data engineering experience using Java, Scala, and/or Python, with at least 3 years' experience in a data- and BI-focused role
Experience in data integration (ETL/ELT) development using multiple languages (e.g., Python, PySpark, SparkSQL) and data transformation (e.g., dbt)
Experience building data pipelines supporting a variety of integration and information delivery methods, as well as data modelling techniques and analytics
Knowledge and experience with various relational databases and demonstrable proficiency in SQL and data analysis requiring complex queries and optimization
Experience with AWS-based data services technologies (e.g., Glue, RDS, Athena) and Snowflake CDW, as well as BI tools (e.g., Power BI)
Willingness to experiment and learn new approaches and technology applications
Knowledge of software engineering and agile development best practices
Excellent written and verbal communication skills
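
Of the AWS data services this role names, Athena is commonly driven from Python via boto3. A small hedged example; the region, database, query, and results bucket are placeholders, not details from the posting:

```python
# Kick off an Athena query and print its execution ID.
import boto3

athena = boto3.client("athena", region_name="us-east-1")

response = athena.start_query_execution(
    QueryString="SELECT customer_id, SUM(amount) AS total FROM orders GROUP BY customer_id",
    QueryExecutionContext={"Database": "analytics"},
    ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},
)
print(response["QueryExecutionId"])
```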

Posted 2 months ago

Apply

5 - 10 years

7 - 12 Lacs

Hyderabad

Work from Office

Responsibilities:
Design and implement scalable and efficient data systems, ensuring robust data architecture.
Develop innovative data products that align with business goals and enhance data accessibility.
Ensure data integrity and confidentiality through expert application of data security and encryption protocols.
Leverage cloud technologies, particularly AWS, to optimize data operations and enhance performance.
Implement and manage RBAC security models within Snowflake to maintain secure data environments.
Collaborate across business and technical teams to ensure effective communication and project delivery.
Manage large-scale data projects from inception to completion, ensuring timely and successful outcomes.

Required Skills:
Bachelor's degree in software engineering, computer science, or a related field, or 5+ years of experience in data engineering.
5+ years of experience working with Snowflake in a data engineering capacity.
Strong proficiency in data architecture and demonstrated experience in data product development.
Advanced proficiency in Python and SQL for data transformation and analytics.
Deep understanding of RBAC security models and secure data connection protocols (JDBC, ODBC, etc.).
Excellent communication skills for effective collaboration across diverse teams.
Exposure to systems integrations for seamless data interoperability.
Proficiency in query optimization and algorithmic efficiency.

Preferred Qualities:
Knowledge of machine learning for integrating basic models into data processes.
Strong collaborative skills and the ability to work effectively in team-oriented environments.
Experience with Cortex AI, LLM tools, and developing Streamlit applications.
Familiarity with data governance, security, and compliance best practices.
Certifications in Snowflake, data engineering, or related fields.
Experience with data visualization tools like Tableau or Power BI, and ELT/ETL tools such as Informatica.
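
Snowflake's RBAC model, which this role emphasizes, is administered through SQL grants. One illustrative sequence via the Python connector; the account, credentials, role, warehouse, and object names are all invented for the sketch:

```python
# Sketch of a Snowflake RBAC grant sequence: create a read-only role,
# grant it the objects it needs, and assign it to a user.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="deploy_user", password="***",
)
cur = conn.cursor()
cur.execute("CREATE ROLE IF NOT EXISTS analyst_ro")
cur.execute("GRANT USAGE ON WAREHOUSE reporting_wh TO ROLE analyst_ro")
cur.execute("GRANT USAGE ON DATABASE sales TO ROLE analyst_ro")
cur.execute("GRANT USAGE ON SCHEMA sales.public TO ROLE analyst_ro")
cur.execute("GRANT SELECT ON ALL TABLES IN SCHEMA sales.public TO ROLE analyst_ro")
cur.execute("GRANT ROLE analyst_ro TO USER some_analyst")
conn.close()
```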

Posted 2 months ago

Apply

8 - 13 years

10 - 15 Lacs

Bengaluru

Work from Office

Role Description
Corporate Banking is a technology-centric business, with an increasing move to real-time processing and an increasing appetite from customers for integrated systems and access to supporting data. This means that technology is more important than ever for the business. Deutsche Bank is one of the few banks with the scale and network to compete aggressively in this space, and the breadth of investment in this area is unmatched by our peers. Joining the team is a unique opportunity to help rebuild the core of some of our most mission-critical processing systems from the ground up.

Our Corporate Bank Technology team is a global team of 3000 engineers (and growing!) across 30 countries. The primary businesses that we support within Corporate Bank are Cash Management, Securities Services, Trade Finance and Trust & Agency Services. CB Technology supports these businesses through CIO-aligned teams and also by 'horizontals' such as Client Connectivity, Surveillance and Regulatory, Infrastructure, Architecture, Production, and Risk & Control.

In addition to providing cash management services like Payments to our customers, Corporate Bank is the payment service provider for the entire Deutsche Bank organization. As such, we have been tasked with ensuring technology and security risk for payments is within risk tolerance bank-wide. We are looking for a Technology specialist to join our Risk and Control team in a hybrid role spanning Information Technology and Information Security audit management. In summary, this involves hands-on technical data analysis and control process improvement, control effectiveness testing, control uplift remediation activities, and overall ensuring that technology and security controls are implemented effectively and sustainably.

The Risk and Control Team ensures the Bank's information control priorities are effectively implemented across Corporate Bank Technology. The team offers dedicated support for each Chief Information Officer (CIO) business line, advisory services for control responses, and program management services for broad control uplifts. The team's mission is to reduce the organization's technology risk exposure by implementing key bank controls, ensuring appropriate and timely resolution of audit issues, and participating in the Bank's design of control implementations. Your role would therefore be integral in supporting front-line management in identifying, assessing and measuring risks, identifying remediation actions, and monitoring risks.

Deutsche Bank's Corporate Bank division is a leading provider of cash management, trade finance and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices, as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support.

Your Key Responsibilities:
In the Risk & Control Governance team, you will be responsible for a hybrid of activities involving Information Technology and Information Security controls, and will partner with CIO application teams and risk leads to improve the overall risk posture for the area. The focus will be on creating and managing dashboards that give senior stakeholders an appealing and interactive view of critical insights from data. Key responsibilities for the role include:
Design and build interactive dashboards that effectively communicate complex data stories to stakeholders
Utilize Tableau's advanced features, such as calculated fields, parameters, and custom visualizations, to create insightful and visually appealing dashboards
Be comfortable creating complex calculations in Tableau
Perform data blending and data aggregation, and create complex calculated fields in Tableau
Ensure data accuracy and consistency across all visualizations by procuring data from various sources
Collaborate closely with business users to understand their requirements and translate them into technical specifications
Provide training and support to end-users on Tableau functionalities
Conduct thorough testing of Tableau dashboards and reports to ensure they meet business requirements
Identify and resolve any data discrepancies or visualization issues

Your skills and experience:
Extensive proven experience (at least 8 years) as a Senior Tableau Developer in a similar organisation
A solid understanding of data concepts: data warehousing, SQL, relational databases, normalization, and proficiency in the use of query and reporting analysis tools
Excellent analytical skills with strong attention to detail
Strong understanding of ELT/ETL and integration concepts and design best practices
Deep understanding and experience with Tableau Desktop, Tableau Server and Tableau Prep for creating dashboards and visualizations
Proficiency in SQL and data modelling to transform data, build queries and create relationships among tables
Analytical skills to interpret complex data and translate it into easily understandable visualizations and reports
Familiarity with standard data tools such as Excel (macros, pivot tables, etc.)
Collaboration with stakeholders to understand and prioritize functional requirements
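
Tableau Server work of this kind is often scripted with the tableauserverclient library. As a small illustrative sketch only (the server URL, site, and credentials are placeholders, and the posting itself does not name this library), listing published workbooks looks like:

```python
# List workbooks on a Tableau Server site via tableauserverclient.
import tableauserverclient as TSC

auth = TSC.TableauAuth("tableau_user", "***", site_id="analytics")
server = TSC.Server("https://tableau.example.com", use_server_version=True)

with server.auth.sign_in(auth):
    all_workbooks, pagination = server.workbooks.get()
    for wb in all_workbooks:
        print(wb.name, wb.project_name)
```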

Posted 2 months ago

Apply

5 - 10 years

20 - 35 Lacs

Pune

Hybrid

Job Description: Data Engineer
We are seeking a skilled Data Engineer with a deep understanding of data integration and engineering techniques to design and deliver robust, scalable, and efficient data solutions. In this role, you will focus on the design, implementation, automation, and optimization of data pipelines and enterprise-wide applications, such as data lakes and data warehouses, to manage and analyze large and complex datasets effectively.

Key Responsibilities
Design, develop, and maintain high-performing data pipelines for extracting, transforming, and loading (ETL/ELT) data.
Collaborate with QA teams to ensure the accuracy and integrity of data pipelines through rigorous testing.
Manage source code repositories to maintain software versioning and collaboration.
Utilize workflow orchestration tools to schedule and execute pipeline workflows seamlessly.
Implement CI/CD techniques and platforms to automate pipeline deployments and ensure continuous delivery.

Required Skills & Qualifications
Proven expertise in data engineering, with a track record of successfully delivering complex projects.
In-depth knowledge of ELT/ETL processes, data warehousing, and data visualization techniques.
Advanced SQL skills, with the ability to write and optimize queries for relational database platforms.
Experience working within major public cloud ecosystems such as AWS, Azure, or Google Cloud Platform (GCP).
Proficiency in workflow orchestration and scheduling solutions.
Hands-on experience with tools like Snowflake and Matillion.

Preferred Qualifications
Experience in building automated testing frameworks for validating data pipelines.
Solid understanding of Continuous Integration and Delivery (CI/CD) principles and their practical application.

This is an exciting opportunity to work with cutting-edge technologies and drive impactful data solutions within a dynamic and collaborative environment.

Posted 2 months ago

Apply

4 - 7 years

18 - 20 Lacs

Delhi NCR, Gurgaon

Hybrid

What You'll Do
Design, document, and implement data pipelines to feed data models for consumption in Snowflake using dbt and Airflow.
Ensure data correctness and completeness in engineering pipelines for accurate analytical dashboard insights.
Monitor and triage technical challenges in critical situations requiring immediate resolution.
Evaluate technical solutions and develop MVPs or PoCs to support research efforts.
Engage with external stakeholders to stay updated on data security issues and trends.
Review and provide feedback on the work of other tech team members to support their growth.
Implement data performance and security policies in alignment with governance objectives and regulatory requirements.
Mentor and develop team members to enhance their skills and capabilities.

What You'll Bring

Essential Education
Bachelor's degree or equivalent combination of education and experience.
Bachelor's degree in information science, data management, computer science or related field preferred.

Essential Experience & Job Requirements
5+ years of IT experience with a major focus on data warehouse/database-related projects
Exposure to technologies such as dbt, Apache Airflow, and Snowflake is a must
Experience in other data platforms: Oracle, SQL Server, MDM, etc.
Expertise in writing SQL and database objects: stored procedures, functions, and views
Hands-on experience in ETL/ELT and data security, SQL performance optimization, and job orchestration tools and technologies (e.g., dbt, APIs, Apache Airflow)
Experience in data modeling and relational database design
Well-versed in applying SCD, CDC, and DQ/DV frameworks
Demonstrated ability to write new code that is well-documented and stored in a version control system (we use GitHub & Bitbucket)
Good to have: experience with cloud platforms such as AWS, Azure, GCP and Snowflake
Good to have: strong programming/scripting skills (Python, PowerShell, etc.)
Good to have: familiarity with HR systems such as recruitment workflows, applicant tracking systems (ATS), talent acquisition analytics, and HR data structures
Experience working with agile methodologies (Scrum, Kanban) and Meta Scrum with cross-functional teams (Product Owners, Scrum Master, Architects, and data SMEs)
Excellent written and oral communication and presentation skills to present architecture, features, and solution recommendations

Who You'll Work With
Global functional product portfolio technical leaders (Finance, HR, Marketing, Legal, Risk, IT), product owners, and functional area teams across levels
Global Data Product Portfolio Management & teams (Enterprise Data Model, Data Catalog, Master Data Management)
Consulting and internal Data Product Portfolio teams across BCG

Additional info
You have experience in data warehousing, data modeling, and the building of data engineering pipelines. You are well versed in data engineering methods, such as ETL and ELT techniques through scripting and/or tooling. You are good at analyzing performance bottlenecks and providing enhancement recommendations; you have a passion for customer service and a desire to learn and grow as a professional and a technologist. Strong analytical skills related to working with structured, semi-structured, and unstructured datasets.
Collaborating with product owners to identify requirements, define desired outcomes, and deliver trusted results.
Building processes supporting data transformation, data structures, metadata, dependency, and workload management.
In this role, SQL is heavily focused; an ideal candidate must have hands-on experience with SQL database design, plus Python.
Demonstrably deep understanding of SQL (level: advanced) and analytical data warehouses (Snowflake preferred).
Familiar with JIRA & Confluence.
Must have a strong Agile mindset, demonstrating adaptability to changing project needs, iterative development, and a proactive approach to problem-solving. Must be comfortable embracing feedback and continuously improving processes and products.
Desire to continually keep up with advancements in data engineering practices.
Knowledge of AWS cloud and Python is a plus.
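
The core stack here is dbt orchestrated by Airflow into Snowflake. A minimal illustrative DAG under stated assumptions (the project path, schedule, and two-task layout are invented; assumes Airflow 2.4+ for the `schedule` argument):

```python
# Daily dbt run followed by dbt tests, expressed as an Airflow DAG.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_snowflake_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Build the models, then validate them; tests run only if the build succeeds.
    dbt_run = BashOperator(task_id="dbt_run", bash_command="dbt run --project-dir /opt/dbt")
    dbt_test = BashOperator(task_id="dbt_test", bash_command="dbt test --project-dir /opt/dbt")
    dbt_run >> dbt_test
```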

Posted 2 months ago

Apply

8 - 12 years

27 - 32 Lacs

Bengaluru

Work from Office

Work and Technical Experience:
Strong D&A background and an understanding of cloud and data engineering concepts in AWS
Design, develop, and optimize data models and solutions for various business use cases
Build and maintain data pipelines, transformations, and integrations using best practices
Develop and implement performance tuning strategies
Create, maintain, and optimize ETL/ELT processes

Competencies: ETL/ELT, Data Engineering, Cloud, Data Modeling
Technologies: Snowflake, AWS, SQL, Python
SAP background preferred, not a must-have
In-depth knowledge of key cloud services for data integration, BI, and data processing
In-depth knowledge of cloud storage and compute services
Performs data analysis required to troubleshoot data-related issues and assist in their resolution
Working knowledge of S3; good expertise in AWS services like Athena tables and Glue jobs
Strong understanding of the Snowflake data warehouse; must have experience with AWS cloud services
Proficient in SQL, including CTEs and database table design, with the ability to write efficient queries on large data sets
Experienced in Snowflake or strong database experience, including SQL tuning
Skilled in developing and optimizing ETL processes using Snowflake and other tools
Experienced in data analysis and profiling tasks
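
For the Glue jobs mentioned above, the standard PySpark job skeleton looks roughly like the sketch below; the catalog database, table name, filter, and S3 path are placeholders, not details from the posting:

```python
# Skeleton of an AWS Glue PySpark job: read from the Glue Data Catalog,
# filter, and write curated Parquet back to S3.
import sys

from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
sc = SparkContext()
glue_context = GlueContext(sc)
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

dyf = glue_context.create_dynamic_frame.from_catalog(database="sales", table_name="orders")
df = dyf.toDF().where("amount > 0")
df.write.mode("overwrite").parquet("s3://my-bucket/curated/orders/")

job.commit()
```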

Posted 2 months ago

Apply

5 - 10 years

25 - 30 Lacs

Bengaluru

Work from Office

Your opportunity
Do you love the transformative impact data can have on a business? Are you motivated to push for results and overcome all obstacles? Then we have a role for you. New Relic is looking for a Senior Data Engineer to help grow our global engineering team.

What you'll do
Lead the building of scalable, fault-tolerant pipelines with built-in data quality checks that transform, load, and curate data from various internal and external systems
Provide leadership to cross-functional initiatives and projects; influence architecture design and decisions
Build cross-functional relationships with Data Scientists, Product Managers, and Software Engineers to understand data needs and deliver on those needs
Improve engineering processes and cross-team collaboration
Provide thought leadership to grow and evolve the DE function and implement SDLC best practices in building internal-facing data products, staying up-to-date with industry trends, emerging technologies, and best practices in data engineering

This role requires
5+ years of experience in BI and Data Warehousing
Experience and knowledge of building data lakes in AWS (i.e., Spark/Glue, Athena), including data modeling, data quality best practices, and self-service tooling
Demonstrated success leading cross-functional initiatives
Passion for data quality, code quality, SLAs, and continuous improvement
Deep understanding of data system architecture and of ETL/ELT patterns
Development experience in at least one object-oriented language (Python, R, Scala, etc.)
Comfort with SQL and related tooling

Bonus points if you have
Experience with dbt, Airflow, and Snowflake
Experience with Apache Iceberg tables
Data observability experience
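
The role stresses pipelines with built-in data quality checks. One minimal way to gate a PySpark load on such checks, shown purely as a sketch (the S3 paths and column names are invented):

```python
# Data-quality gate: refuse to publish curated data if key checks fail.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-gate").getOrCreate()
df = spark.read.parquet("s3://lake/raw/events/")

null_keys = df.filter(F.col("event_id").isNull()).count()
dupes = df.count() - df.dropDuplicates(["event_id"]).count()

# Fail the pipeline loudly instead of silently loading bad data.
if null_keys or dupes:
    raise ValueError(f"DQ failure: {null_keys} null keys, {dupes} duplicates")

df.write.mode("overwrite").parquet("s3://lake/curated/events/")
```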

Posted 2 months ago

Apply

2 - 4 years

5 - 8 Lacs

Bengaluru

Work from Office

About The Role
As Data Engineer you will work on highly important data projects that will transform how engineering data is used on pharma projects. A few years ago, NNE embarked on a new data journey to build an Azure-based data platform. As the dependency on data is only growing across NNE, we are looking for a new data engineer to support the efforts. You will become part of a strong team of 50+ employees within NNE Global IT and work in a cross-department data platform team of 10 (data engineers, data analysts, and subject-matter experts).

How You Might Spend Your Days
As a Data Engineer, you'll:
Maintain NNE's data platform infrastructure in Microsoft Azure
Help with data storage, processing, and data modelling
Design ETL/ELT processes and build data pipelines
Create PoCs and investigate new data technologies, e.g., Microsoft Fabric
Collaborate with business stakeholders to help them get the most out of their data
Communicate how data can be used in new ways within NNE
Prioritize support for operational tasks and spend the remaining time on development tasks

Who You Are
We care about who you are as a person. In the end, how you work, and your energy, is what impacts the results we achieve as a team. As a person, you:
Encourage teamwork and collaboration across skillsets
Understand how technology can empower the business, and can articulate the value of data solutions
Are well-organized, can plan your work, and can communicate progress on activities
Have analytical skills combined with technical understanding
Have good cross-cultural understanding; fluency in written and spoken English is essential, as the IT team is international and department meetings are conducted in English

The miles you've walked
In all positions there are some things that are needed, and others a bonus. We believe these qualifications are needed for you to do well in this role:
Bachelor's or master's degree in a relevant field
2+ years of experience within data engineering
Experience with SQL and Python
Experience with ETL/ELT processes
Hands-on experience with Azure services; Azure Synapse Analytics and/or Microsoft Fabric will be considered a plus
Knowledge of Kimball's data modelling principles will be considered a plus
Prior working knowledge of Azure DevOps will be considered a plus

Posted 2 months ago

Apply

5 - 7 years

7 - 11 Lacs

Bengaluru

Work from Office

The Data Engineer will be a part of a global team that designs, develops, and delivers BI and Analytics solutions leveraging the latest BI and Analytics technologies for Koch Industries. Koch Industries is a privately held global organization with over 120,000 employees around the world, with subsidiaries involved in manufacturing, trading, and investments. Koch Global Services India (KGSI) is being developed in India to extend its IT operations, as well as act as a hub for innovation in the IT function. As KGSI rapidly scales up its operations in India, its employees will get opportunities to carve out a career path for themselves within the organization. This role will have the opportunity to join on the ground floor and will play a critical part in helping build out KGSI over the next several years. Working closely with global colleagues will provide significant global exposure. This role is a part of the INVISTA team within KGSI.

Our Team
We are seeking a Data Engineer who will be responsible for developing and implementing a future-state data analytics platform, covering both the back-end data processing and the front-end data visualization components, for the Data and Analytics teams. This is a hands-on role building ingestion pipelines and the data warehouse.

What You Will Do
Design, develop, enhance, and debug existing and new SQL code in the data pipeline.
Implement best practices for high data availability, computational efficiency, cost-effectiveness, and data quality within Snowflake and AWS environments.
Build and enhance environments, processes, functionalities, and tools to streamline all stages of data lake implementation and analytics solution development, including proofs of concept, prototypes, and production systems.
Drive automation initiatives throughout the data lifecycle based on configuration-driven approaches, ensuring repeatability and scalability of processes.
Stay updated with relevant technology trends and product updates, especially within AWS services, and incorporate them into the data engineering ecosystem.
Collaborate with cross-functional teams following agile methodologies to ensure alignment with business objectives and best practices for data governance.

Who You Are (Basic Qualifications)
5-7 years of professional experience in data engineering or big data and data warehousing.
Strong hands-on SQL and data modelling skills (must) and intermediate Python.
Proficient in handling high volumes of data and developing SQL queries that are scalable and performant.
Proven experience in ETL/ELT concepts and methodologies.
Proficiency in data analytics tasks such as data wrangling, mining, integration, analysis, visualization, data modelling, and reporting, using BI tools.
Expertise in primary skills including data warehousing and SQL.
Excellent communication skills with the ability to effectively communicate complex technical concepts and drive initiatives within the team and across departments.
Proven track record of a contribution-driven work ethic, striving for excellence in every aspect of data engineering and software development.

What Will Put You Ahead
Delivered BI projects using modern BI tools like Power BI, Tableau, or Qlik Sense.
Knowledge of Qlik Replicate and Denodo for the data virtualization layer.

Posted 2 months ago

Apply

6 - 7 years

20 - 25 Lacs

Hyderabad

Work from Office

Title: Data Engineer
Job Description: A Data Engineer is responsible for designing, building, and maintaining large-scale data systems and infrastructure. Their primary goal is to ensure that data is properly collected, stored, processed, and retrieved to support business intelligence, analytics, and data-driven decision-making.

Key Responsibilities:
1. Design and Develop Data Pipelines: Create data pipelines to extract data from various sources, transform it into a standardized format, and load it into a centralized data repository.
2. Build and Maintain Data Infrastructure: Design, implement, and manage data warehouses, data lakes, and other data storage solutions.
3. Ensure Data Quality and Integrity: Develop data validation, cleansing, and normalization processes to ensure data accuracy and consistency.
4. Collaborate with Data Analysts and Business Process Owners: Work with data analysts and business process owners to understand their data requirements and provide data support for their projects.
5. Optimize Data Systems for Performance: Continuously monitor and optimize data systems for performance, scalability, and reliability.
6. Develop and Maintain Data Governance Policies: Create and enforce data governance policies to ensure data security, compliance, and regulatory requirements.

Experience & Skills:
1. Hands-on experience in implementing, supporting, and administering modern cloud-based data solutions (Google BigQuery, AWS Redshift, Azure Synapse, Snowflake, etc.).
2. Strong programming skills in SQL, Java, and Python.
3. Experience in configuring and managing data pipelines using Apache Airflow, Informatica, Talend, SAP BODS, or API-based extraction.
4. Expertise in real-time data processing frameworks.
5. Strong understanding of Git and CI/CD for automated deployment and version control.
6. Experience with Infrastructure-as-Code tools like Terraform for cloud resource management.
7. Good stakeholder management skills to collaborate effectively across teams.
8. Solid understanding of SAP ERP data and processes to integrate enterprise data sources.
9. Exposure to data visualization and front-end tools (Tableau, Looker, etc.).
10. Strong command of English with excellent communication skills.
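
For one of the cloud warehouses listed (BigQuery), a minimal Python query example using the official client library; the project, dataset, and table names are placeholders, not details from the posting:

```python
# Run an aggregate query against BigQuery and print the rows.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")
query = """
    SELECT region, COUNT(*) AS orders
    FROM `my-project.sales.orders`
    GROUP BY region
"""
for row in client.query(query).result():
    print(row.region, row.orders)
```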

Posted 2 months ago

Apply

4 - 8 years

6 - 10 Lacs

Hyderabad

Work from Office

Responsibilities: An individual contributor who has worked with ERP systems as sources, with sound knowledge of dimensional modeling, data warehousing, and implementation and extensions of Oracle Business Intelligence Applications / Fusion Data Intelligence (Fusion Analytics Warehouse). Experience in designing and developing data pipelines from a variety of source systems into a data warehouse or lakehouse using ODI, Informatica PowerCenter, or other ETL/ELT technologies. Hands-on experience with semantic modeling / metadata (RPD) modeling, and with developing, customizing, maintaining, and supporting complex analyses, data visualizations, and BI Publisher reports in Oracle Analytics Cloud or Oracle Analytics Server, as required by the business users.

Posted 2 months ago

Apply

11 - 16 years

40 - 45 Lacs

Pune

Work from Office

About The Role
Job Title: IT Architect Specialist, AVP
Location: Pune, India

Role Description
This role is for a Senior Business Functional Analyst for Group Architecture. This role will be instrumental in establishing and maintaining bank-wide data policies, principles, standards, and tool governance. The Senior Business Functional Analyst acts as a link between the business divisions and the data solution providers, aligning the target data architecture with the enterprise data architecture principles and applying agreed best practices and patterns. Group Architecture partners with each division of the bank to ensure that architecture is defined, delivered, and managed in alignment with the bank's strategy and in accordance with the organization's architectural standards.

What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
Best-in-class leave policy
Gender-neutral parental leaves
100% reimbursement under childcare assistance benefit (gender neutral)
Sponsorship for industry-relevant certifications and education
Employee Assistance Program for you and your family members
Comprehensive hospitalization insurance for you and your dependents
Accident and term life insurance
Complementary health screening for those 35 years and above

Your key responsibilities
Data Architecture: Work closely with stakeholders to understand their data needs, break business requirements into implementable building blocks, and design the solution's target architecture.
GCP Data Architecture & Migration: Strong working experience with GCP data architecture is a must (BigQuery, Dataplex, Cloud SQL, Dataflow, Apigee, Pub/Sub, ...), along with an appropriate GCP architecture-level certification. Experience in handling hybrid architecture and patterns addressing non-functional requirements like data residency, compliance (e.g., GDPR), and security and access control. Experience in developing reusable components and reference architecture using IaC (Infrastructure as Code) platforms such as Terraform.
Data Mesh: Proficiency in Data Mesh design strategies that embrace the decentralized nature of data ownership. The candidate must have good domain knowledge to ensure that the data products developed are aligned with business goals and provide real value.
Data Management Tooling: Assess various tools and solutions comprising data governance capabilities such as data catalogue, data modelling and design, metadata management, data quality and lineage, and fine-grained data access management. Assist in developing the medium- to long-term target state of the technologies within the data governance domain.
Collaborate with stakeholders, including business leaders, project managers, and development teams, to gather requirements and translate them into technical solutions.

Your skills and experience
Extensive experience in data architecture within Financial Services
Strong technical knowledge of data integration patterns, batch and stream processing, data lake / data lakehouse / data warehouse / data mart, caching patterns, and policy-based fine-grained data access
Proven experience working on data management principles, data governance, data quality, data lineage, and data integration, with a focus on Data Mesh
Knowledge of data modelling concepts like dimensional modelling and 3NF; experience with systematic, structured review of data models to enforce conformance to standards
High-level understanding of data management solutions (e.g., Collibra, Informatica Data Governance)
Proficiency in data modelling and experience with different data modelling tools
Very good understanding of streaming and non-streaming ETL and ELT approaches for data ingest
Strong analytical and problem-solving skills, with the ability to identify complex business requirements and translate them into technical solutions

How we'll support you
Training and development to help you excel in your career
Coaching and support from experts in your team
A culture of continuous learning to aid progression
A range of flexible benefits that you can tailor to suit your needs

About us and our teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
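
Among the GCP services named above, Pub/Sub underpins the streaming ingest patterns this role covers. A minimal publish example, purely as a sketch; the project, topic, and payload are illustrative:

```python
# Publish a single message to a Pub/Sub topic and wait for the ack.
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "payment-events")

future = publisher.publish(topic_path, b'{"payment_id": "123", "status": "settled"}')
print(future.result())  # prints the server-assigned message ID
```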

Posted 2 months ago

Apply

2 - 6 years

12 - 16 Lacs

Bengaluru

Work from Office

Design, construct, install, test, and maintain highly scalable data management systems using big data technologies such as Apache Spark (with a focus on Spark SQL) and Hive.
Manage and optimize our data warehousing solutions, with a strong emphasis on SQL performance tuning.
Implement ETL/ELT processes using tools like Talend or custom scripts, ensuring efficient data flow and transformation across our systems.
Utilize AWS services including S3, EC2, and EMR to build and manage scalable, secure, and reliable cloud-based solutions.
Develop and deploy scripts in Linux environments, demonstrating proficiency in shell scripting.
Utilize scheduling tools such as Airflow or Control-M to automate data processes and workflows.
Implement and maintain metadata-driven frameworks, promoting reusability, efficiency, and data governance.
Collaborate closely with DevOps teams utilizing SDLC tools such as Bamboo, JIRA, Bitbucket, and Confluence to ensure seamless integration of data systems into the software development lifecycle.
Communicate effectively with both technical and non-technical stakeholders, for handover, incident management reporting, etc.

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise
Demonstrated expertise in big data technologies, specifically Apache Spark (focus on Spark SQL) and Hive.
Extensive experience with AWS services, including S3, EC2, and EMR.
Strong expertise in data warehousing and SQL, with experience in performance optimization.
Experience with ETL/ELT implementation (e.g., Talend).
Proficiency in Linux, with a strong background in shell scripting.

Preferred technical and professional experience
Familiarity with scheduling tools like Airflow or Control-M.
Experience with metadata-driven frameworks.
Knowledge of DevOps tools such as Bamboo, JIRA, Bitbucket, and Confluence.
Excellent communication skills and a willing attitude towards learning.
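
Given the focus on Spark SQL over Hive tables, a minimal PySpark session with Hive support might run a freshness check like the sketch below; the schema and table names are invented for illustration:

```python
# Query a Hive-backed table with Spark SQL: row counts for the last week.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("hive-sparksql")
         .enableHiveSupport()   # lets Spark read tables from the Hive metastore
         .getOrCreate())

spark.sql("""
    SELECT dt, COUNT(*) AS rows_loaded
    FROM warehouse.fact_sales
    WHERE dt >= date_sub(current_date(), 7)
    GROUP BY dt
    ORDER BY dt
""").show()
```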

Posted 2 months ago

Apply

3 - 6 years

5 - 8 Lacs

Udaipur

Work from Office

Data Engineer
Role: Data Engineer
Industry: Software/IT
Location: Udaipur (Rajasthan)
Job Type: Full Time
Experience: 3-6 years
Skills: Data engineering, ETL/ELT processes, data ingestion, cloud platforms, big data tools, CI/CD pipelines
Salary: Best in the industry
Education: BTech (CS/IT/EC), BCA, MCA

Description:
3+ years of experience in a data engineering or related role, managing complex data workflows and large-scale datasets.
Expertise in SQL, with the ability to write, optimize, and troubleshoot complex queries for relational databases like MySQL, PostgreSQL, Oracle, or Snowflake.
Proficiency in Python, including libraries like Pandas, PySpark, and others used for data manipulation and processing.
Experience with big data tools such as Apache Spark, Hadoop, or Kafka.
Familiarity with cloud platforms (AWS, Azure, GCP) and their data processing services (e.g., AWS Glue, Google BigQuery, Azure Data Factory).
Strong understanding of data modeling, normalization, and schema design.
Experience in using version control systems like Git for collaborative development.
Knowledge of CI/CD pipelines for data workflows is a plus.
Strong problem-solving skills with attention to detail and the ability to debug data issues efficiently.

Responsibilities:
Design, build, and maintain scalable data pipelines and architectures for large-scale data processing.
Develop and optimize ETL/ELT processes for data ingestion, transformation, and loading.
Collaborate with data analysts and scientists to ensure data accessibility and usability.
Monitor and improve data quality and system performance.
Implement best practices for data governance, security, and compliance.
Work with cross-functional teams to integrate data systems with existing and new technology stacks.
Utilize distributed computing tools (e.g., Spark, Hadoop) and cloud platforms (AWS, Azure, GCP) for efficient data handling.
Develop automated workflows for data validation and reporting.

Posted 2 months ago

Apply

0 - 2 years

2 - 4 Lacs

Udaipur

Work from Office

Python Developer
Role: Python Developer
Industry: IT/Software
Location: Udaipur (Rajasthan)
Job Type: Full Time
Experience: Freshers - 2 years
Skills: Python, database libraries, data architecture and dimension modeling, ETL/ELT frameworks
Salary: Best in the industry
Education: BTech (CS/IT/EC)

Description:
Execute data architecture and data management projects for both new and established data sources.
Innovate and contribute to the development of the client's data platforms using Python. Familiarity with transitioning existing data sets and databases to a new technology stack is helpful.
Manage the end-to-end process for data ingestion and publishing.
Perform data loads and data quality analysis to identify potential errors within the data platform.
Work closely with operations teams to understand data flow and architecture, and gather functional requirements.
Experience in a data production environment, with a focus on adeptly managing vast volumes of intricate data.
Hands-on experience in SQL programming, data architecture, and dimension modeling.
Expertise in Python programming, showcasing deep knowledge of libraries such as Beautiful Soup, Selenium, Requests, Pandas, data structures, and algorithms.
Proficiency in crafting efficient, reusable, and modular code.
In-depth knowledge of RDBMSs, with the ability to design and optimize complex SQL queries. Relational database experience with MySQL, Postgres, Oracle, or Snowflake is preferred.
Expertise in mapping, standardizing, and normalizing data.
Knowledge of ETL/ELT frameworks and writing pipelines for loading millions of records is helpful.
Use of version control systems like Git, effectively managing code repositories.
Strong analytical skills for addressing complex technical challenges, including proficiency in debugging and performance optimization techniques.
A thorough understanding of the software development lifecycle, from requirements analysis to testing and deployment.
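
The libraries listed (Requests, Beautiful Soup, Pandas) combine into a typical ingestion flow like the sketch below; the URL and HTML structure are hypothetical, used only to show the pattern:

```python
# Fetch a page, parse listing cards, normalize into a DataFrame, export CSV.
import pandas as pd
import requests
from bs4 import BeautifulSoup

resp = requests.get("https://example.com/listings", timeout=30)
resp.raise_for_status()

soup = BeautifulSoup(resp.text, "html.parser")
rows = [
    {"title": card.find("h2").get_text(strip=True),
     "price": card.find("span", class_="price").get_text(strip=True)}
    for card in soup.select("div.listing")   # assumed page structure
]

df = pd.DataFrame(rows).drop_duplicates()    # normalize before loading
df.to_csv("listings.csv", index=False)
```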

Posted 2 months ago

Apply

1 - 3 years

3 - 5 Lacs

Hyderabad

Work from Office

Responsibilities:
Provide comprehensive consultation to business unit and IT management and staff on all phases of application testing and automation for diverse development platforms and computing environments (e.g., host-based, distributed systems, client-server, software, hardware, technologies and tools, etc.).
Work closely with business and IT management and staff to identify application automation solutions, new or modified programs, reuse of existing automation solutions through the use of program development software alternatives, integration of purchased solutions, or a combination of the available alternatives.
Research and evaluate alternative solutions and recommend the most efficient and cost-effective automation programming solution and tooling.
Document, test, implement, and provide ongoing support for automation solutions.
Execute on a strategy to hand over test automation to specific Agile teams for adoption and usage within their areas of focus.
Exercise considerable creativity, foresight, and judgment in conceiving, planning, and delivering initiatives.
Use deep professional knowledge and acumen to advise functional leaders.
Focus on providing thought leadership within Application Development while working on broader projects that require understanding of the wider business.

Qualifications

Required Skills:
Technology skillset required: Selenium, BDD, TDD, Java, Maven, Jenkins, Git, Cucumber
Strong written and verbal communication skills with the ability to interact with all levels of the organization
Strong interpersonal skills
Strong time and project management skills
Familiarity with agile methodology, including SCRUM
Familiarity with modern delivery practices such as continuous integration, behavior/test-driven development, and specification by example

Required Experience & Education:
College degree (Bachelor's) in related technical/business areas or equivalent work experience
3+ years of experience, with 1-3 years of automation development experience working in a large corporate environment

Desired Experience:
Experience working in an Agile framework
Technology skillset: exposure to Angular, NodeJS, AWS, .NET, and ELT solutions
Healthcare experience
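
The required stack here is Java-based (Selenium, Maven, Cucumber); purely for illustration, the same basic Selenium flow is shown with Selenium's Python bindings, with a made-up URL and locators:

```python
# Drive a login page and assert on the resulting title.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()   # assumes a local ChromeDriver is available
try:
    driver.get("https://example.com/login")
    driver.find_element(By.ID, "username").send_keys("qa_user")
    driver.find_element(By.ID, "password").send_keys("***")
    driver.find_element(By.ID, "submit").click()
    assert "Dashboard" in driver.title  # simple post-login check
finally:
    driver.quit()
```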

Posted 2 months ago

Apply

5 - 8 years

14 - 20 Lacs

Pune

Remote

Strong proficiency in Python, SQL, AWS, Azure, or Google Cloud, and ETL/ELT processes.

Posted 2 months ago

Apply

7 - 9 years

10 - 12 Lacs

Chennai

Hybrid

About the Role:
We are seeking a highly skilled and experienced Senior Data Engineer to join our growing data team. In this role, you will be responsible for designing, developing, and maintaining scalable and efficient data pipelines and solutions on the Azure Databricks platform. You will leverage your expertise in PySpark, Python, and SQL to transform raw data into actionable insights, supporting critical business decisions.

Responsibilities:
Design, develop, and maintain robust and scalable data pipelines using PySpark and Python within the Azure Databricks environment.
Optimize data processing and storage for performance and efficiency.
Develop and implement data models and schemas for efficient data storage and retrieval.
Write complex SQL queries to extract, transform, and load data from various sources.
Implement data quality checks and monitoring to ensure data accuracy and reliability.
Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions.
Troubleshoot and resolve data-related issues in a timely manner.
Implement and maintain CI/CD pipelines for data engineering workflows.
Stay up-to-date with the latest data engineering technologies and best practices.
Contribute to the design and implementation of data governance and security policies.
Work with Azure services like Azure Data Lake Storage, Azure SQL Database, and Azure Data Factory.
Performance-tune Spark jobs.

Qualifications:
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
5+ years of experience in data engineering, with a strong focus on PySpark and Python.
Extensive experience with Azure Databricks and its associated services.
Strong proficiency in SQL, including writing complex queries and optimizing performance.
Experience with data warehousing and ETL/ELT processes.
Solid understanding of distributed computing concepts and architectures.
Experience with version control systems (e.g., Git).
Familiarity with CI/CD pipelines and DevOps practices.
Excellent problem-solving and analytical skills.
Strong communication and collaboration skills.
Experience with Azure Data Lake Storage Gen2 and Azure Data Factory.
Knowledge of data modeling techniques.
Experience with data quality and data governance concepts.
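
A small PySpark/Delta sketch of the kind of Databricks pipeline step this role describes; the mount paths, column names, and partitioning choice are assumptions for illustration:

```python
# Aggregate raw transactions into daily per-merchant totals on Delta Lake.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("databricks-elt").getOrCreate()

raw = spark.read.format("delta").load("/mnt/datalake/raw/transactions")

curated = (raw
           .withColumn("txn_date", F.to_date("txn_ts"))
           .groupBy("txn_date", "merchant_id")
           .agg(F.sum("amount").alias("daily_amount")))

# Partitioning by date keeps downstream dashboard reads cheap.
(curated.write
 .format("delta")
 .mode("overwrite")
 .partitionBy("txn_date")
 .save("/mnt/datalake/curated/daily_merchant_totals"))
```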

Posted 2 months ago

Apply

6 - 10 years

6 - 10 Lacs

Gurgaon

Work from Office

Role: Azure Data Engineer (Databricks)
Location: Gurugram
Experience: 5 to 9 years
Notice Period: Immediate to 30 days NTE

Job Description: This requirement is for a Data Engineer in Gurugram for a Data Analytics project.
Building ETL/ELT pipelines of data from various sources using SQL/Python/Spark.
Ensuring that data are modelled and processed according to architecture and requirements, both functional and non-functional.
Understanding and implementing required development guidelines, design standards, and best practices.
Delivering the right solution architecture, automation, and technology choices.
Working cross-functionally with enterprise architects, information security teams, and platform teams.
Suggesting and implementing architecture improvements.
Experience with programming languages such as Python or Scala.
Knowledge of data warehouse, business intelligence, and ETL/ELT data processing issues.
Ability to create and orchestrate ETL/ELT processes in different tools (ADF, Databricks Workflows).
Experience working with the Databricks platform: workspace, Delta Lake, workflows, jobs, Unity Catalog.
Understanding of SQL and relational databases.
Practical knowledge of various relational and non-relational database engines in the cloud (Azure SQL Database, Azure Cosmos DB, Microsoft Fabric, Databricks).
Hands-on experience with data services offered by the Azure cloud.
Knowledge of Apache Spark (Databricks, Azure Synapse Spark pools).
Experience in performing code review of ETL/ELT pipelines and SQL queries.
Analytical approach to problem solving.

Posted 2 months ago

Apply

1 - 2 years

2 - 5 Lacs

Karnataka

Work from Office

EXP: 4 to 6 yrs | Location: Any PSL location | Rate: below $14
JD - DBT/AWS Glue/Python/PySpark
Hands-on experience in data engineering, with expertise in DBT, AWS Glue, Python, and PySpark.
Strong knowledge of data engineering concepts, data pipelines, ETL/ELT processes, and cloud data environments (AWS).
Technology: DBT, AWS Glue, Athena, SQL, Spark, PySpark.
Good understanding of Spark internals and how it works; good skills in PySpark.
Good understanding of DBT; in particular, should understand DBT's limitations and when its use can end up in model explosion.
Good hands-on experience in AWS Glue.
AWS expertise: should know the different services, how to configure them, and have infra-as-code experience.
Basic understanding of the different open data formats: Delta, Iceberg, Hudi.
Ability to engage in technical conversations and suggest enhancements to the current architecture and design.

Posted 2 months ago

Apply

4 - 9 years

0 Lacs

Mysore, Bengaluru, Hyderabad

Hybrid

Open & Direct Walk-in Drive event | Hexaware Technologies | SNOWFLAKE & SNOWPARK Data Engineer/Architect in Bangalore, Karnataka on 29th March [Saturday] 2025 - Snowflake/Snowpark/SQL & PySpark

Dear Candidate,
I hope this email finds you well. We are thrilled to announce an exciting opportunity for talented professionals like yourself to join our team as a Data Engineer/Architect. We are hosting an Open Walk-in Drive in Bangalore, Karnataka on 29th March [Saturday] 2025, and we believe your skills in Snowflake/Snowpark/Python/SQL & PySpark align perfectly with what we are seeking.

Experience Level: 4 years to 12 years

Details of the Walk-in Drive:
Date: 29th March [Saturday] 2025
Experience: 5 years to 15 years
Time: 9.30 AM to 4 PM
Point of Contact: Azhagu Kumaran Mohan / +91-9789518386
Venue: Hexaware Technologies Ltd, Shanti Niketan, 11th Floor, Crescent - 2 Prestige, Whitefield Main Rd, Mahadevapura, Bengaluru, Karnataka 560048
Work Location: Open (Hyderabad/Bangalore/Pune/Mumbai/Noida/Dehradun/Chennai/Coimbatore)

Key Skills and Experience:
As a Data Engineer, we are looking for candidates who possess expertise in the following: Snowflake, Python, Fivetran, Snowpark & Snowpipe, SQL, PySpark/Spark, DWH.

Roles and Responsibilities:
4-15 years of total IT experience on any ETL/Snowflake cloud tool.
Min 3 years of experience in Snowflake.
Min 3 years of experience in querying and processing data using Python.
Strong SQL, with experience in using analytical functions, materialized views, and stored procedures.
Experience with the data loading features of Snowflake: stages, streams, tasks, and Snowpipe.
Working knowledge of processing semi-structured data.

What to Bring: updated resume, photo ID, passport-size photo.

How to Register: To express your interest and confirm your participation, please reply to this email with your updated resume attached. Walk-ins are also welcome on the day of the event. This is an excellent opportunity to showcase your skills, network with industry professionals, and explore the exciting possibilities that await you at Hexaware Technologies. If you have any questions or require further information, please feel free to reach out to me at AzhaguK@hexaware.com / +91-9789518386. We look forward to meeting you and exploring the potential of having you as a valuable member of our team.

Note: candidates with less than 4 years of total experience will not be screen-selected to attend the interview.
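
For the Snowpark skills the drive screens for, a minimal session-plus-transformation sketch; the connection values and table names are placeholders, not details from the announcement:

```python
# Snowpark: connect, build a transformation that executes inside Snowflake,
# and persist the result as a table.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_

session = Session.builder.configs({
    "account": "my_account", "user": "etl_user", "password": "***",
    "warehouse": "etl_wh", "database": "sales", "schema": "public",
}).create()

orders = session.table("orders")
summary = (orders.where(col("amount") > 0)
                 .group_by("region")
                 .agg(sum_("amount").alias("revenue")))

# The aggregation is pushed down to Snowflake; only the result is materialized.
summary.write.save_as_table("region_revenue", mode="overwrite")
```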

Posted 2 months ago

Apply

5 - 10 years

7 - 12 Lacs

Bengaluru

Work from Office

Responsibilities
Develop, operate, optimize, test, and maintain the business's data warehouse, driving the full life-cycle of its back-end development, including ETL processes, cube development, database and performance administration, and dimensional design of table structures.
Identify, design, and implement internal process improvements, including re-designing infrastructure for greater scalability, optimizing data delivery, and automating manual processes.
Define data retention policies.
Build analytical tools to utilize the data pipeline, providing actionable insights into key business performance metrics, including operational efficiency and customer acquisition.
Choose and integrate tools for monitoring, managing, alerting, and improving customer database performance.
Develop and implement automated processes that ensure uninterrupted updating and correction of database vulnerabilities.
Assemble large, complex sets of data that meet non-functional and functional business requirements.

Requirements
5+ years of experience or over 5 completed projects.
Knowledge of data management fundamentals (data modeling, ELT/ETL, data quality, metadata management, data warehouse/lake patterns, distributed systems).
AWS/Azure cloud; Snowflake and Databricks are must-haves.
Strong proficiency with SQL and its variations among popular databases.
Experience with some modern relational databases.
Knowledge of the Python programming language.

Posted 2 months ago

Apply

5 - 8 years

15 - 30 Lacs

Hyderabad

Hybrid

Role: Senior Cloud Data Engineer
Exp: 5+ yrs
Loc: Hyderabad
Skills: AWS, ELT, data pipelines, cloud-based data engineering, data warehousing, Python, SQL, Spark, Databricks, Snowflake, BigQuery
Mahesh | 8125251748 | 3ghr19@gmail.com

Posted 2 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies