10660 ETL Jobs - Page 30

JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

At NXP, innovation is in our DNA; every year we spend ~$2B on R&D (~13,000 engineers). The NPI tracker is a tool used to monitor and manage New Product Introduction (NPI) projects, ensuring that all tasks and milestones are on track. It provides visibility into project progress, budget, customer traction, resource allocation, requirements, test results and potential risks, helping teams make informed decisions. The Business Analyst gathers and analyzes data, identifies trends, and provides insights to support decision-making processes, working closely with various stakeholders to ensure the successful implementation and continuous improvement of the NPI tracker. The main responsibility is the development and maintenance of Power BI reports that delight users: easy to understand, showing relevant business insights that lead to action. The Business Analyst will continuously improve the quality of the dashboards and their adoption. The ideal candidate has a passion for data and the high-tech (semiconductor) industry, is technically savvy, likes to improve continuously and has a strong drive to deliver results.

Key Responsibilities
- Data Analysis: Collect, analyze, and interpret data related to R&D projects to identify trends, issues, and opportunities for improvement.
- Reporting: Develop and maintain standard reports and dashboards to provide visibility into project progress, risks, and performance metrics.
- Stakeholder Collaboration: Work with project managers, resource managers, finance & strategy managers, IT teams, and other stakeholders to gather requirements, define project scope, and ensure alignment with business objectives.
- Process Improvement: Identify and recommend process improvements to enhance the efficiency and effectiveness of the NPI tracker.
- Documentation: Create and maintain comprehensive documentation, including business requirements, process flows, and user guides.
- Support: Provide ongoing support and training to users of the NPI tracker, addressing any issues or questions that arise.

Qualifications
- Master's degree in Computer Science, Information Systems, Business Administration, or a related field
- Proven technical savviness and data literacy
- Excellent data transformation and visualization skills in Power BI Desktop, Power BI Service, Power Query and DAX
- Proficient in databases, ETL, SQL and data modeling
- Knowledge of AWS and programming languages such as Python is a plus
- Strong analytical and problem-solving skills
- Excellent communication and collaboration skills
- Affinity with the fast-paced, dynamic high-technology semiconductor industry

Preferred Skills
- Understanding of program management data, e.g. project schedule, resource allocation, time writing, requirements & test data, business case data
- Familiarity with data governance frameworks and methodologies
- Experience with an agile way of working and cross-functional team environments
- Hands-on experience in software development best practices (CI/CD), version control, including release management, testing and documentation

About The CTO Office
The CTO Office is a small team (~30 people) that specializes in "R&D Craftsmanship". It drives R&D efficiency and collaboration across NXP's ~13,000 engineers through the following focus areas: transparent programming, planning & cost allocation; harmonized processes, methods and tools; NXP R&D improvement & strategic programs; 'state of the art' analytics and reporting; technical leadership and program management culture. More information about NXP in India...
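For a sense of the data analysis this role describes, here is a minimal Python sketch that computes milestone slippage from a hypothetical NPI-tracker extract (all column names and values are invented for illustration; the actual tracker schema is not public):

```python
import pandas as pd

# Hypothetical NPI-tracker extract: planned vs. actual milestone dates.
df = pd.DataFrame({
    "project": ["P1", "P1", "P2"],
    "milestone": ["tape-out", "samples", "tape-out"],
    "planned": pd.to_datetime(["2024-03-01", "2024-06-01", "2024-04-15"]),
    "actual": pd.to_datetime(["2024-03-10", "2024-06-20", "2024-04-12"]),
})

# Slippage in days per milestone, then the average per project --
# the kind of KPI a Power BI dashboard built on this data might surface.
df["slip_days"] = (df["actual"] - df["planned"]).dt.days
print(df.groupby("project")["slip_days"].mean())
```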

Posted 2 days ago

Apply

5.0 years

0 Lacs

Dehradun, Uttarakhand, India

On-site

Cynoteck is currently hiring a Salesforce Technical Lead with excellent interpersonal communication skills and the relevant experience, knowledge and skillset.

Key Responsibilities:
- Lead the end-to-end technical design, architecture, and implementation of Salesforce solutions.
- Collaborate with functional teams to understand business requirements and translate them into scalable and maintainable Salesforce solutions.
- Provide technical leadership and mentorship to Salesforce developers, guiding them through best practices and development challenges.
- Design and implement custom Salesforce applications, including complex workflows, process builders, flows, triggers, and integrations with third-party systems.
- Ensure adherence to Salesforce development standards and best practices.
- Lead Salesforce system upgrades, patches, and new feature releases, ensuring minimal disruption to operations.
- Manage data migration and integration strategies, including integration with other internal and external systems.
- Oversee testing strategies and ensure that all deliverables meet the required quality standards.
- Stay current with Salesforce updates, new features, and industry best practices, and evaluate their relevance to the business.

Required Skills & Qualifications:
- Ability to work in a fast-paced, agile environment.
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 5+ years of experience working with Salesforce, including hands-on development experience on the Salesforce platform.
- Strong understanding of Salesforce architecture, data model, and capabilities.
- Excellent problem-solving, analytical, and troubleshooting skills.
- Experience leading and mentoring development teams.
- Expertise in Salesforce development tools: Apex, Visualforce, Lightning Web Components (LWC), SOQL, and Salesforce APIs.
- Experience with Salesforce integrations using REST/SOAP APIs, middleware, or ETL tools.
- Proficient in Salesforce declarative configuration (Flows, Process Builder, Workflow Rules, etc.).
- Experience in deploying Salesforce changes using Salesforce DX, CI/CD processes, and change management tools.
- Strong understanding of security concepts in Salesforce (profiles, permission sets, roles, sharing rules).
- Strong communication and interpersonal skills, with the ability to collaborate effectively with cross-functional teams.
- Hands-on experience with Salesforce Lightning, including Lightning Components and Lightning Experience, is preferred.
- Salesforce certifications (e.g., Salesforce Platform Developer) are highly preferred.
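As an illustration of the REST integration work this role mentions, here is a minimal Python sketch that runs a SOQL query through the Salesforce REST API (the instance URL and access token are placeholders; a real integration would obtain the token via an OAuth flow):

```python
import requests

# Placeholders: a real org URL and an OAuth access token obtained elsewhere.
INSTANCE_URL = "https://example.my.salesforce.com"
ACCESS_TOKEN = "00D...token"

def run_soql(soql: str) -> list[dict]:
    """Run a SOQL query via the Salesforce REST API and return the records."""
    resp = requests.get(
        f"{INSTANCE_URL}/services/data/v59.0/query",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        params={"q": soql},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["records"]

if __name__ == "__main__":
    for rec in run_soql("SELECT Id, Name FROM Account LIMIT 5"):
        print(rec["Id"], rec["Name"])
```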

Posted 2 days ago

Apply

2.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Job: Project Scientist III (Data Analyst). Apply here.

Project title: NDMC Phase II: Developing Models to Estimate and Project Disease Burden to Inform Control and/or Elimination Strategies for Priority Diseases in India

About the project: IIT Bombay is the anchor organization of the National Disease Modelling Consortium (NDMC). The consortium partners with various institutions in the country for disease modelling work. The objective of the project is to address policy and programmatic questions through India-specific disease models, to improve disease control and intervention strategies in the country. More information about the consortium can be found at www.ndmconsortium.com

Essential Qualifications & Experience: PhD in data sciences with a minimum of 2 years' relevant experience, or MSc in data sciences with 5 years' experience; OR PhD in Computer Engineering or any other engineering discipline, or MTech in Data Science, Computer Science, Statistics or a related field with a minimum of 3 years' relevant experience, or BTech/BE/BDes or equivalent engineering degree with a minimum of 5 years' experience.

Desirable experience:
- At least 8-10 years of experience in data science, with a proven track record in managing and supervising data science teams.
- Strong proficiency in R and Python for data analysis and modeling, along with extensive SQL skills for database management.
- Comprehensive understanding of Azure/cloud-based server and database development, with demonstrated experience in implementing cloud solutions.
- Exceptional analytical abilities and familiarity with statistical techniques relevant to data interpretation.
- Experience with data visualization tools, such as Tableau and Power BI.
- Strong leadership, critical thinking, and problem-solving skills, with the ability to work collaboratively across teams.

Job Profile:
- Lead and mentor a team of data scientists, facilitating collaboration, knowledge sharing, and professional growth to achieve project goals.
- Oversee the design and implementation of efficient database schemas and structures, ensuring optimal performance and scalability while maintaining data integrity and security.
- Guide the development, maintenance, and optimization of databases that support data storage and retrieval processes, including executing complex SQL queries for data manipulation.
- Drive the development of advanced data analysis and modeling, utilizing programming languages such as R and Python.
- Manage ETL (Extract, Transform, Load) processes to ensure data is prepared for analysis and reporting efficiently.
- Hands-on experience working with data systems and platforms such as Azure Data Factory, Azure Storage, Data Lake, Azure Synapse, Databricks or similar.
- Solid understanding of cloud-based data management and data analytics principles and tools.
- Promote the use of data visualization tools (such as Tableau and Power BI) to communicate effectively and enhance understanding across the organization.
- Clearly communicate complex methodologies, findings, and analytical results to multidisciplinary teams and external partners.
- Contribute to the preparation of comprehensive research reports and presentations.

Pay Details: Consolidated salary 78,000 + HRA (if applicable) per month.
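As a toy illustration of the projection work the project describes, the sketch below fits a linear trend to invented annual case counts and extrapolates three years ahead. Real NDMC models are far richer (compartmental, statistical, or Bayesian), so treat this only as a flavor of the Python tooling involved:

```python
import numpy as np

# Invented annual case counts for a hypothetical disease.
years = np.array([2018, 2019, 2020, 2021, 2022])
cases = np.array([120_000, 115_000, 98_000, 91_000, 86_000])

# Fit a straight line and project forward.
slope, intercept = np.polyfit(years, cases, 1)
for y in (2023, 2024, 2025):
    print(y, int(slope * y + intercept))
```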

Posted 2 days ago

Apply

4.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Position Title: Data Scientist
Location: Gurugram
Experience: 3-4 Years
Job Type: Full-Time
Company: Sequifi | www.sequifi.com

About the Role: We are hiring a Data Scientist with 3-4 years of experience who has a strong foundation in data analysis, machine learning, and business problem-solving. This role is ideal for someone who is hands-on with modern tools and techniques, eager to explore new technologies, and enjoys working in a collaborative, fast-paced environment.

Key Responsibilities:
- Analyze complex datasets to uncover patterns, trends, and actionable insights.
- Build, validate, and deploy machine learning models for predictive analytics, classification, and clustering.
- Design and maintain efficient data pipelines and ETL processes.
- Create clear, interactive dashboards and reports using tools such as Power BI, Tableau, or Python visualization libraries.
- Collaborate with product managers, developers, and business analysts to understand requirements and deliver data-driven solutions.
- Conduct A/B testing and statistical analysis to support product decisions and optimizations.
- Continuously improve model performance based on feedback and business objectives.

Required Qualifications:
- Bachelor's or Master's degree in Data Science, Computer Science, Statistics, or a related field.
- 3-4 years of experience in data science or a similar role.
- Strong programming skills in Python (Pandas, NumPy, Scikit-learn), SQL, and familiarity with PHP or Node.js.
- Hands-on experience with data visualization tools such as Power BI or Tableau.
- Good understanding of machine learning algorithms, data preprocessing, and feature engineering.
- Experience working with structured and unstructured data, including NoSQL databases like MongoDB.
- Familiarity with cloud platforms such as AWS, GCP, or Azure is a plus.

Soft Skills:
- Strong analytical and problem-solving abilities.
- Effective communication and data storytelling skills.
- Ability to work independently as well as collaboratively in cross-functional teams.
- A mindset geared toward innovation, learning, and adaptability.

Why Join Us:
- Work on meaningful and challenging problems in a tech-focused environment.
- Join a young, supportive, and fast-moving team.
- Gain exposure to a combination of data science, product, and engineering.
- Opportunity to learn and grow continuously in a culture of innovation.

If you're passionate about using data to drive business impact, we'd love to hear from you. Apply now and grow with us at Sequifi.
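To illustrate the model-building responsibility above, here is a compact scikit-learn sketch, using a synthetic dataset in place of real business data:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split, cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

# Synthetic stand-in for a real business dataset.
X, y = make_classification(n_samples=1_000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Preprocessing and model in one pipeline, so the same steps
# run identically at training and scoring time.
model = Pipeline([
    ("scale", StandardScaler()),
    ("clf", LogisticRegression(max_iter=1000)),
])

print("CV accuracy:", cross_val_score(model, X_train, y_train, cv=5).mean())
model.fit(X_train, y_train)
print("Holdout accuracy:", model.score(X_test, y_test))
```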

Posted 2 days ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Company Description: NXP Semiconductors enables secure connections and infrastructure for a smarter world, advancing solutions that make lives easier, better and safer. As the world leader in secure connectivity solutions for embedded applications, we are driving innovation in the secure connected vehicle, end-to-end security & privacy and smart connected solutions markets.

Organization Description: Do you feel challenged by being part of the IT department of NXP, the company with a mission of "Secure Connections for a Smarter World"? Do you perform best in a role representing IT in projects in a fast-moving, international environment? Within R&D IT Solutions, the Product Creation Applications (PCA) department is responsible for providing and supporting the R&D design community globally with best-in-class applications and support. The applications are used by over 6,000 designers.

Job Summary: As a Graph Engineer, you will:
- Develop pipelines and code to support the ingress and egress of data to and from the knowledge graphs.
- Perform basic and advanced graph querying and data modeling on the knowledge graphs that lie at the heart of the organization's Product Creation ecosystem.
- Maintain the (ETL) pipelines, code and knowledge graph to stay scalable, resilient and performant in line with customers' requirements.
- Work in an international and Agile DevOps environment.

This position offers an opportunity to work in a globally distributed team where you will get a unique opportunity for personal development in a multicultural environment. You will also get a challenging environment in which to develop expertise in technologies useful in the industry.

Primary Responsibilities:
- Translate requirements of business functions into "graph thinking".
- Build and maintain graphs and related applications from data and information, using the latest graph technologies to leverage high-value use cases.
- Support and manage graph databases.
- Integrate graph data from various sources, internal and external.
- Extract data from various sources, including databases, APIs, and flat files.
- Load data into target systems, such as data warehouses and data lakes.
- Develop code to move data (ETL) from the enterprise platform applications into the enterprise knowledge graphs.
- Optimize ETL processes for performance and scalability.
- Collaborate with data engineers, data scientists and other stakeholders to model the graph environment to best represent the data coming from the multiple enterprise systems.

Skills / Experience:
- Semantic Web technologies: RDF, RDFS, OWL, SHACL, SPARQL, JSON-LD, N-Triples/N-Quads, Turtle, RDF/XML, TriX
- API-led architectures: REST, SOAP, microservices, API management
- Graph databases such as Dydra, Amazon Neptune, Neo4j, Oracle Spatial & Graph (a plus)
- Experience with other NoSQL databases, such as key-value databases and document-based databases (e.g. XML databases), is a plus
- Experience with relational databases
- Programming experience, preferably Java, JavaScript, Python, PL/SQL
- Experience with web technologies: HTML, CSS, XML, XSLT, XPath
- Experience with modelling languages such as UML
- Understanding of CI/CD automation, version control, build automation, testing frameworks, static code analysis, IT service management, artifact management, container management, and experience with related tools and platforms
- Familiarity with cloud computing concepts (e.g. AWS and Azure)
Education & Personal Skillsets:
- A master's or bachelor's degree in computer science, mathematics, electronics engineering or a related discipline, with at least 10 years of experience in a similar role
- Excellent problem-solving and analytical skills
- A growth mindset with a curiosity to learn and improve
- Team player with strong interpersonal, written, and verbal communication skills
- Business consulting and technical consulting skills
- An entrepreneurial spirit and the ability to foster a positive and energized culture
- Fluent communication skills in English (spoken and written)
- Experience working in Agile (Scrum knowledge appreciated) with a DevOps mindset

More information about NXP in India...
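To give a flavor of the graph querying and modeling central to this role, here is a small, self-contained Python sketch using rdflib (the namespace and triples are invented; a production knowledge graph would live in a store such as Amazon Neptune or Dydra):

```python
from rdflib import Graph, Literal, Namespace, RDF

# Hypothetical product-creation namespace for illustration only.
EX = Namespace("http://example.org/pc/")

g = Graph()
g.add((EX.chipA, RDF.type, EX.Product))
g.add((EX.chipA, EX.designedBy, Literal("Team Alpha")))

# A basic SPARQL query over the in-memory graph.
query = """
PREFIX ex: <http://example.org/pc/>
SELECT ?product ?team
WHERE { ?product a ex:Product ; ex:designedBy ?team . }
"""
for product, team in g.query(query):
    print(product, team)
```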

Posted 2 days ago

Apply

3.0 - 8.0 years

4 - 9 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Job Title: ETL/DWT Test Lead

Responsibilities: A day in the life of an Infoscion: As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management and adherence to organizational guidelines and processes. You would be a key contributor to building efficient programs and systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Technical and Professional Requirements: Primary skills: Technology: ETL Testing / Data Warehouse Testing

Additional Responsibilities:
- Knowledge of more than one technology
- Basics of architecture and design fundamentals
- Knowledge of testing tools
- Knowledge of agile methodologies
- Understanding of project life cycle activities on development and maintenance projects
- Understanding of one or more estimation methodologies; knowledge of quality processes
- Basics of the business domain, to understand the business requirements
- Analytical abilities, strong technical skills, good communication skills
- Good understanding of the technology and domain
- Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles and modelling methods
- Awareness of the latest technologies and trends
- Excellent problem solving, analytical and debugging skills

Educational Requirements: BTech/BE/MTech/ME/BCA/MCA/BSc/MSc
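As a sketch of the kind of validation an ETL/DW test lead automates, the snippet below reconciles row counts and aggregates between a source and a target, with in-memory SQLite standing in for the real systems:

```python
import sqlite3

# SQLite stand-ins for the real source system and warehouse connections.
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
src.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
src.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10.0), (2, 20.5)])
tgt.execute("CREATE TABLE dw_orders (id INTEGER, amount REAL)")
tgt.executemany("INSERT INTO dw_orders VALUES (?, ?)", [(1, 10.0), (2, 20.5)])

def check(label, src_sql, tgt_sql):
    """Compare a scalar metric between source and target and report it."""
    s = src.execute(src_sql).fetchone()[0]
    t = tgt.execute(tgt_sql).fetchone()[0]
    print(f"{'PASS' if s == t else 'FAIL'} {label}: source={s} target={t}")

# Row-count and aggregate checks are the bread and butter of DW testing.
check("row count", "SELECT COUNT(*) FROM orders", "SELECT COUNT(*) FROM dw_orders")
check("amount sum", "SELECT SUM(amount) FROM orders", "SELECT SUM(amount) FROM dw_orders")
```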

Posted 2 days ago

Apply

2.0 - 4.0 years

4 - 6 Lacs

Hyderabad

Work from Office

This role involves the development and application of engineering practice and knowledge in defining, configuring and deploying industrial digital technologies (including but not limited to PLM and MES) for managing continuity of information across the engineering enterprise, including design, industrialization, manufacturing and supply chain, and for managing manufacturing data.

Job Description - Grade Specific: Focuses on digital continuity and manufacturing. Develops competency in own area of expertise. Shares expertise and provides guidance and support to others. Interprets clients' needs. Completes own role independently or with minimum supervision. Identifies problems and relevant issues in straightforward situations and generates solutions. Contributes to teamwork and interacts with customers.

Posted 2 days ago

Apply

4.0 - 8.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Role: Data Engineer
Employment Type: Full time
Timing: General
Work Mode: Work from Office
Experience: 4-8 Years
Location: Ahmedabad
Notice period: Only immediate joiners or candidates currently serving notice (joining before 30 June 2025)

Role and Responsibilities:
• Provide business analytics support to the management team
• Analyze business results and manage studies to collect relevant data
• Design, build, and maintain data pipelines and ETL processes using Python as part of larger data platform projects
• Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and ensure data quality
• Optimize database performance, including indexing, partitioning, and query optimization
• Implement data governance and security measures to protect sensitive data
• Monitor data pipelines, troubleshoot issues, and perform data validation
• Develop and maintain documentation for data processes and workflows

Skills Required:
• Proficiency in Python for data processing and scripting
• Strong SQL knowledge and experience with relational databases (e.g., MySQL, PostgreSQL, SQL Server)
• Understanding of data modelling, data warehousing, and data architecture
• Experience with cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services
• Proficiency in working with GCP (especially BigQuery and GCS)
• Version control skills using Git

MUST HAVE: Data Engineering, GCP (BigQuery, GCS, Dataflow, Airflow), Python, SQL (MySQL / PostgreSQL / SQL Server), Data Modelling, Data Warehousing, Data Architecture, Git
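For illustration, a minimal sketch of querying BigQuery from Python with the official client library (the project, dataset, and table names are placeholders, and application-default credentials are assumed):

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

# Assumes application-default credentials; project name is hypothetical.
client = bigquery.Client(project="my-analytics-project")

query = """
    SELECT order_date, SUM(amount) AS revenue
    FROM `my-analytics-project.sales.orders`
    GROUP BY order_date
    ORDER BY order_date DESC
    LIMIT 7
"""
# Run the query and iterate the result rows.
for row in client.query(query).result():
    print(row.order_date, row.revenue)
```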

Posted 2 days ago

Apply

18.0 years

0 Lacs

Greater Kolkata Area

On-site

About The Company: e.l.f. Beauty, Inc. stands with every eye, lip, face and paw. Our deep commitment to clean, cruelty-free beauty at an incredible value has fueled the success of our flagship brand e.l.f. Cosmetics since 2004 and driven our portfolio expansion. Today, our multi-brand portfolio includes e.l.f. Cosmetics, e.l.f. SKIN, pioneering clean beauty brand Well People, Keys Soulcare, a groundbreaking lifestyle beauty brand created with Alicia Keys, and Naturium, high-performance, biocompatible, clinically-effective and accessible skincare. In our fiscal year 2024 we had net sales of $1 billion, and our business performance has been nothing short of extraordinary, with 24 consecutive quarters of net sales growth. We are the #2 mass cosmetics brand in the US and the fastest-growing mass cosmetics brand among the top 5. Our total compensation philosophy offers every full-time new hire competitive pay and benefits, bonus eligibility (200% of target over the last four fiscal years), equity, flexible time off, year-round half-day Fridays, and a hybrid work environment (3 days in office, 2 days at home). We believe the combination of our unique culture, total compensation, workplace flexibility and care for the team is unmatched across not just beauty but any industry. Visit our Career Page to learn more about our team: https://www.elfbeauty.com/work-with-us

Job Summary: We're looking for a strategic and technically strong Senior Data Architect to join our high-growth digital team. The selected person will play a critical role in shaping the company's global data architecture and vision. The ideal candidate will lead enterprise-level architecture initiatives, collaborate with engineering and business teams, and guide a growing team of engineers and QA professionals. This role involves deep engagement across domains including Marketing, Product, Finance, and Supply Chain, with a special focus on marketing technology and commercial analytics relevant to the CPG/FMCG industry. The candidate should bring a hands-on mindset, a proven track record in designing scalable data platforms, and the ability to lead through influence. An understanding of industry-standard frameworks (e.g., TOGAF) and tools such as CDPs, MMM platforms, and AI-based insights generation will be a strong plus. Curiosity, communication, and architectural leadership are essential to succeed in this role.

Key Responsibilities:
- Enterprise Data Strategy: Design, define and maintain a holistic data strategy and roadmap that aligns with corporate objectives and fuels digital transformation. Ensure data architecture and products align with enterprise standards and best practices.
- Data Governance & Quality: Establish scalable governance frameworks to ensure data accuracy, privacy, security, and compliance (e.g., GDPR, CCPA). Oversee quality, security and compliance initiatives.
- Data Architecture & Platforms: Oversee modern data infrastructure (e.g., data lakes, warehouses, streaming) with technologies like Snowflake, Databricks, AWS, and Kafka.
- Marketing Technology Integration: Ensure data architecture supports marketing technologies and commercial analytics platforms (e.g., CDP, MMM, ProfitSphere) tailored to the CPG/FMCG industry.
- Architectural Leadership: Act as a hands-on architect with the ability to lead through influence. Guide design decisions aligned with industry best practices and e.l.f.'s evolving architecture roadmap.
- Cross-Functional Collaboration: Partner with Marketing, Supply Chain, Finance, R&D, and IT to embed data-driven practices and deliver business impact. Lead integration of data from multiple sources into a unified data warehouse.
- Cloud Optimization: Optimize data flows and storage for performance and scalability. Lead data migration priorities; manage metadata repositories and data dictionaries. Optimize databases and pipelines for efficiency. Manage and track quality, cataloging and observability.
- AI/ML Enablement: Drive initiatives to operationalize predictive analytics, personalization, demand forecasting, and more using AI/ML models. Evaluate emerging data technologies and tools to improve data architecture.
- Team Leadership: Lead, mentor, and enable a high-performing team of data engineers, analysts, and partners through influence and thought leadership.
- Vendor & Tooling Strategy: Manage relationships with external partners and drive evaluations of data and analytics tools.
- Executive Reporting: Provide regular updates and strategic recommendations to executive leadership and key stakeholders.
- Data Enablement: Design data models, database structures, and data integration solutions to support large volumes of data.

Qualifications And Requirements:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field
- 18+ years of experience in Information Technology
- 8+ years of experience in data architecture, data engineering, or a related field, with a focus on large-scale, distributed systems
- Strong understanding of data use cases in the CPG/FMCG sector; experience with tools such as MMM (Marketing Mix Modeling), CDPs, ProfitSphere, or inventory analytics preferred
- Awareness of architecture frameworks like TOGAF; certifications are not mandatory, but candidates must demonstrate clear thinking and experience in applying architecture principles
- Excellent communication skills and a proven ability to work cross-functionally across global teams; capable of leading with influence, not just execution
- Knowledge of data warehousing, ETL/ELT processes, and data modeling
- Deep understanding of data modeling principles, including schema design and dimensional data modeling
- Strong SQL development experience, including SQL queries and stored procedures
- Ability to architect and develop scalable data solutions, staying ahead of industry trends and integrating best practices in data engineering
- Familiarity with data security and governance best practices
- Experience with cloud computing platforms such as Snowflake, AWS, Azure, or GCP
- Excellent problem-solving abilities with a focus on data analysis and interpretation
- Strong communication and collaboration skills
- Ability to translate complex technical concepts into actionable business strategies
- Proficiency in one or more programming languages such as Python, Java, or Scala

This job description is intended to describe the general nature and level of work being performed in this position. It also reflects the general details considered necessary to describe the principal functions of the job identified, and shall not be considered a detailed description of all the work required or inherent in the job. It is not an exhaustive list of responsibilities, and it is subject to changes and exceptions at the supervisor's discretion. e.l.f. Beauty respects your privacy. Please see our Job Applicant Privacy Notice (www.elfbeauty.com/us-job-applicant-privacy-notice) for how your personal information is used and shared.

Posted 2 days ago

Apply

5.0 - 10.0 years

0 - 2 Lacs

Pune, Chennai

Work from Office

Mandatory Skills: 1. Spark 2. SQL 3. Python

JD:
Must Have:
• Relevant experience of 5-8 years as a Data Engineer
• Preferred experience in related technologies as follows:
• SQL: 2-4 years of experience
• Spark: 1-2 years of experience
• NoSQL databases: 1-2 years of experience
• Database architecture: 2-3 years of experience
• Cloud architecture: 1-2 years of experience
• Experience in a programming language like Python
• Good understanding of ETL (Extract, Transform, Load) concepts
• Good analytical and problem-solving skills
• Inclination for learning and self-motivation
• Knowledge of ticketing tools like JIRA/SNOW
• Good communication skills to interact with customers on issues and requirements

Good to Have:
• Knowledge of or experience in Scala
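A minimal PySpark sketch of the extract-transform-load pattern this role centers on (bucket paths and column names are placeholders):

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: read raw CSV files (path is a placeholder).
raw = spark.read.option("header", True).csv("s3a://raw-bucket/orders/*.csv")

# Transform: cast types, drop bad rows, aggregate per customer.
clean = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount").isNotNull())
)
per_customer = clean.groupBy("customer_id").agg(
    F.sum("amount").alias("total_spent")
)

# Load: write partitioned Parquet to the curated zone.
per_customer.write.mode("overwrite").parquet("s3a://curated-bucket/customer_totals/")
```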

Posted 2 days ago

Apply

3.0 - 5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

TCS Hiring!!!
Role: Google Data Engineer
Experience: 3-5 years
Location: Hyderabad & Chennai

Job Description:
- 3 to 5 years of relevant experience in data engineering, data warehousing, or a related field
- Experience with dashboarding tools like PLX dashboards and Looker Studio
- Experience with building data pipelines, reports, best practices and frameworks
- Experience with design and development of scalable and actionable solutions (dashboards, automated collateral, web applications)
- Experience with code refactoring for optimal performance
- Experience writing and maintaining ETLs which operate on a variety of structured and unstructured sources
- Familiarity with non-relational data storage systems (NoSQL and distributed database management systems)
- Strong proficiency in SQL, NoSQL, ETL tools, BigQuery and at least one programming language (e.g., Python, Java)
- Strong understanding of data structures, algorithms, and software design principles
- Experience with data modeling techniques and methodologies
- Proficiency in troubleshooting and debugging complex data-related issues
- Ability to work independently and as part of a team
- Experience with Cloud Storage or equivalent cloud platforms
- Knowledge of BigQuery ingress and egress patterns
- Experience in writing Airflow DAGs
- Knowledge of Pub/Sub, Dataflow or any declarative data pipeline tools using batch and streaming ingestion
- Other GCP services: Vertex AI
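As a sketch of the Airflow DAG authoring mentioned above, here is a minimal two-task pipeline (task bodies are stubs and the DAG name is hypothetical; written against the Airflow 2.4+ API):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull from source")  # stub: extraction logic goes here

def load():
    print("load into BigQuery")  # stub: load logic goes here

with DAG(
    dag_id="daily_sales_pipeline",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2  # extract runs before load
```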

Posted 2 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

About Company: Our client is a global IT, consulting, and business process services company headquartered in Bengaluru, India. It offers end-to-end IT services, including application development, infrastructure management, and digital transformation, and serves clients across industries such as banking, healthcare, retail, energy, and manufacturing. It specializes in modern technologies like cloud computing, AI, data analytics, and cybersecurity. The company has a strong global presence, operating in over 66 countries and employing more than 250,000 people worldwide. It is known for helping enterprises modernize their IT infrastructure and adopt agile practices. Its divisions include consulting, software engineering, and managed services, and it integrates automation and AI into its services to boost efficiency and innovation.

Job Title: DataStage Developer
· Location: Pune (Hybrid)
· Experience: 6+ years
· Job Type: Contract to hire
· Notice Period: Immediate joiners
· Mandatory Skills: DataStage

DataStage Developer Responsibilities:
- Reviewing and discussing briefs with key personnel assigned to projects.
- Designing and building scalable DataStage solutions.
- Configuring clustered and distributed scalable parallel environments.
- Updating data within repositories, data marts, and data warehouses.
- Assisting project leaders in determining project timelines and objectives.
- Monitoring jobs and identifying bottlenecks in the data processing pipeline.
- Testing and troubleshooting problems in ETL system designs and processes.
- Improving existing ETL approaches and solutions used by the company.
- Providing support to customers on issues relating to the storage, handling, and access of data.

DataStage Developer Requirements:
- Bachelor's degree in computer science, information systems, or a similar field.
- Demonstrable experience as a DataStage developer.
- IBM DataStage certification or a similar qualification.
- Proficiency in SQL or another relevant coding language.
- Experience with, or understanding of, other ETL tools, such as Informatica, Oracle ETL, or Xplenty.
- Knowledge of data modeling, database design, and the data warehousing ecosystem.
- Skilled at the ideation, design, and deployment of DataStage solutions.
- Excellent analytical and problem-solving skills.
- The ability to work within a multidisciplinary team.

Posted 2 days ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

We are hiring for one of the Big 4 IT consulting firms.

Designation: Associate/Associate Consultant
Location: Chennai/Gurgaon/Pune

Skills Required:
- AWS (big data services): S3, Glue, Athena, EMR
- Programming: Python, Spark, SQL, MuleSoft, Talend, dbt
- Data warehouse: ETL, Redshift / Snowflake

Key Responsibilities:
- Work with business stakeholders to understand their business needs.
- Create data pipelines that extract, transform, and load (ETL) data from various sources into a usable format in a data warehouse.
- Clean, filter, and validate data to ensure it meets quality and format standards.
- Develop data model objects (tables, views) to transform the data into a unified format for downstream consumption.
- Monitor, control, configure, and maintain processes in the cloud data platform.
- Optimize data pipelines and data storage for performance and efficiency.
- Participate in code reviews and provide meaningful feedback to other team members.
- Provide technical support and troubleshoot issues.

Qualifications:
- Bachelor's degree in computer science, Information Technology, or a related field, or equivalent work experience.
- Experience working in the AWS cloud platform.
- Data engineer with expertise in developing big data and data warehouse platforms.
- Experience working with structured and semi-structured data.
- Expertise in developing big data solutions and ETL/ELT pipelines for data ingestion, data transformation, and optimization techniques.
- Experience working directly with technical and business teams.
- Able to create technical documentation.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration abilities.

Skillset (good to have):
- Experience in data modeling.
- AWS certification for Data Engineer skills.
- Experience with ITSM processes/tools such as ServiceNow, Jira.
- Understanding of Spark, Hive, Kafka, Kinesis, Spark Streaming, and Airflow.
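To illustrate the AWS big data services listed above, here is a small boto3 sketch that submits an Athena query (the database, table, region, and results bucket are placeholders):

```python
import boto3  # pip install boto3

# Region and names below are hypothetical.
athena = boto3.client("athena", region_name="ap-south-1")

# Kick off a query; Athena writes results to the given S3 location.
resp = athena.start_query_execution(
    QueryString="SELECT status, COUNT(*) FROM orders GROUP BY status",
    QueryExecutionContext={"Database": "sales"},
    ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},
)
print("query execution id:", resp["QueryExecutionId"])
```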

Posted 2 days ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Proficiency in AI tools used to prepare and automate data pipelines and ingestion:
- Apache Spark, especially with MLlib
- PySpark and Dask for distributed data processing
- Pandas and NumPy for local data wrangling
- Apache Airflow to schedule and orchestrate ETL/ELT jobs
- Google Cloud (BigQuery, Vertex AI)
- Python (the most popular language for AI and data tasks)
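For example, a minimal Dask sketch of the distributed processing pattern in that list (the file pattern and column names are placeholders):

```python
import dask.dataframe as dd  # pip install "dask[dataframe]"

# Read many CSV files in parallel; the path pattern is hypothetical.
df = dd.read_csv("data/events-*.csv")

# Transformations are lazy; nothing executes until .compute().
per_user = df.groupby("user_id")["value"].mean()

print(per_user.compute().head())
```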

Posted 2 days ago

Apply

6.0 - 11.0 years

15 - 20 Lacs

Pune

Hybrid

Role & responsibilities:
- B.Tech or M.Tech in Computer Science, or equivalent experience.
- 5+ years of experience working professionally as a Python software developer.
- Organized, self-directed, and resourceful.
- Excellent written and verbal communication skills.
- Expert in Python and pandas.
- Experience in building data pipelines and ETL and ELT processes.
- Advanced working SQL experience with relational databases, including query authoring (SQL) and working familiarity with a variety of databases.
- Understanding of Docker and data orchestration tools.
- Experience with Jupyter notebooks.
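A small pandas sketch of the pipeline work described above, with SQLite standing in for the production database and the file path invented:

```python
import sqlite3

import pandas as pd

# Extract: read a raw CSV export (filename is a placeholder).
raw = pd.read_csv("exports/transactions.csv", parse_dates=["created_at"])

# Transform: normalize column names and drop obvious bad rows.
clean = (
    raw.rename(columns=str.lower)
       .dropna(subset=["amount"])
       .query("amount > 0")
)

# Load: append into a relational table (SQLite stands in for the real DB).
with sqlite3.connect("warehouse.db") as conn:
    clean.to_sql("transactions", conn, if_exists="append", index=False)
```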

Posted 2 days ago

Apply

6.0 - 10.0 years

10 - 20 Lacs

Hyderabad

Work from Office

Key Skills: SQL, ETL

Roles & Responsibilities:
- Develop and maintain data solutions using ETL tools, SQL, and at least one programming or reporting technology (Python, Java, React, MSTR, Tableau, or Power BI).
- Work on cloud platforms like AWS or any other public cloud environment for data operations and analytics.
- Debug and resolve complex incidents reported by clients, ensuring SLA compliance.
- Prepare and present status updates during client meetings and address queries.
- Collaborate with operations teams to optimize performance and ensure operational stability.
- Provide technical mentorship and guidance to team members; foster a high-performance culture.
- Perform task management, monitor team deliverables, and track SLA performance.
- Utilize tools such as ServiceNow and SharePoint for documentation and workflow management.
- Follow the Software Development Life Cycle (SDLC) to ensure structured delivery of solutions.
- Lead and coach a team of 10-15 members effectively.
- Encourage a learning mindset within the team and explore new domains or technologies.
- Demonstrate strong attention to detail and commitment to quality.
- Exhibit excellent communication, documentation, and stakeholder management skills.

Experience Requirements:
- 6-10 years of experience in IT with a focus on data analytics and development.
- Experience working with AWS or other public cloud platforms.
- Hands-on experience with at least one programming or visualization tool (Python/Java/React/MSTR/Tableau/Power BI).
- Proficiency in SQL and database tools is mandatory.
- Experience with ETL processes and tools.
- Familiarity with ServiceNow and SharePoint.
- Solid understanding of SDLC practices.
- Experience in leading or mentoring a team of 10-15 members.
- Strong planning, organizing, and task management abilities.
- Excellent verbal and written communication skills.
- Ability to work independently and collaboratively in a fast-paced environment.

Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field.

Posted 2 days ago

Apply

5.0 years

0 Lacs

Pimpri Chinchwad, Maharashtra, India

On-site

Job Title: Support Specialist – Eagle Platform (Portfolio Management)
Location: Riyadh, Saudi Arabia
Type: Full-time / Contract
Industry: Banking / Investment Management / FinTech
Experience Required: 5+ years

We are seeking a highly skilled Support Specialist with hands-on experience working on BNY Mellon's Eagle Investment Systems, particularly the Eagle STAR, PACE, and ACCESS modules used for portfolio accounting, data management, and performance reporting. The ideal candidate will have supported the platform in banking or asset management environments, preferably with experience at Bank of America, BNY Mellon, or institutions using Eagle for middle- and back-office operations.

Key Responsibilities:
- Provide day-to-day technical and functional support for the Eagle platform, including the STAR, PACE, and Performance modules
- Troubleshoot and resolve user issues related to portfolio accounting, performance calculation, and reporting
- Act as a liaison between business users and technical teams for change requests, data corrections, and custom reports
- Monitor batch jobs, data feeds (security, pricing, transaction data), and system interfaces
- Work closely with front-office, middle-office, and operations teams to ensure accurate data processing and reporting
- Manage SLA-driven incident resolution and maintain support documentation
- Support data migrations, upgrades, and new release rollouts of Eagle components
- Engage in root cause analysis and implement preventive measures

Required Skills And Experience:
- 5+ years of experience in financial systems support, with a strong focus on Eagle Investment Systems
- Strong knowledge of portfolio management processes, NAV calculations, and financial instruments (equities, fixed income, derivatives)
- Prior work experience at Bank of America, BNY Mellon, or with asset managers using Eagle is highly preferred
- Proficiency in SQL and ETL tools, and an understanding of data architecture in financial environments
- Familiarity with upstream/downstream systems such as Bloomberg, Aladdin, or CRD is a plus
- Strong analytical skills and attention to detail
- Excellent communication skills in English (Arabic is a plus)

Preferred Qualifications:
- Bachelor's degree in Computer Science, Finance, or a related field
- ITIL Foundation or a similar certification in service management
- Prior experience working in a banking or asset management firm in the GCC is a bonus

Posted 2 days ago

Apply

0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site

Key Skills & Qualifications:
- Strong experience in SAP BW (preferably BW/4HANA): data modeling, extraction, transformation, and loading (ETL).
- Hands-on experience in SAP Analytics Cloud (SAC): building stories, dashboards, and predictive analytics.
- Proficiency in integrating BW data into SAC and managing data connections.
- Solid understanding of HANA views, CDS views, and ABAP for BW enhancements.
- Good knowledge of SAP ECC/S4HANA data sources and business processes.
- Experience in Agile delivery methodology is a plus.
- Strong analytical, problem-solving, and communication skills.
- Ability to work independently and in cross-functional teams.

Preferred Qualifications:
- SAP BW and/or SAC certification.
- Prior experience in a client-facing delivery or consulting role.
- Experience with BOBJ, Analysis for Office, or other reporting tools is a plus.

Posted 2 days ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Ciena is committed to our people-first philosophy. Our teams enjoy a culture focused on prioritizing a personalized and flexible work environment that empowers an individual's passions, growth, wellbeing and belonging. We're a technology company that leads with our humanity, driving our business priorities alongside meaningful social, community, and societal impact.

How You Will Contribute: As the CISO & Executive Metrics and Reporting Analyst, you will report directly to the Chief Information Security Officer (CISO) and play a pivotal role in shaping and communicating the security posture of the organization. You will be responsible for developing and managing a comprehensive security metrics and reporting framework that supports executive decision-making and regulatory compliance.

Key Responsibilities:
- Define, track, and analyze key performance and risk indicators (KPIs/KRIs) aligned with security goals and frameworks (e.g., NIST, ISO 27001).
- Deliver regular and ad-hoc executive-level reports and dashboards that translate complex security data into actionable insights.
- Collect and analyze data from SIEM systems, security tools, and incident reports to support risk management and strategic planning.
- Collaborate with IT, compliance, and business units to align on metrics and reporting requirements.
- Continuously improve reporting processes and stay current with cybersecurity trends and best practices.

The Must Haves:
- Education: Bachelor's degree in Computer Science, Information Systems, Cybersecurity, or a related field. A Master's degree is a plus.
- Experience: Minimum 5 years in cybersecurity metrics and reporting, preferably in an executive-facing role.
- Experience with data visualization tools (e.g., Power BI, Tableau, Excel).
- Familiarity with SIEM systems (e.g., Splunk) and cybersecurity frameworks (e.g., NIST, ISO 27001).
- Proficiency in SQL and experience with Snowflake for data warehousing.
- Strong analytical skills with the ability to interpret complex data sets.
- Experience with ETL processes and Python scripting is a plus.
- Excellent written and verbal communication skills, with the ability to present to non-technical stakeholders.

Assets:
- Relevant certifications such as CISSP, CISM, or CRISC.
- Experience working in cross-functional teams and influencing stakeholders.
- Strategic thinking and adaptability to evolving security threats and technologies.
- Strong attention to detail and a proactive approach to problem-solving.
- Passion for continuous improvement and innovation in cybersecurity reporting.

Not ready to apply? Join our Talent Community to get relevant job alerts straight to your inbox. At Ciena, we are committed to building and fostering an environment in which our employees feel respected, valued, and heard. Ciena values the diversity of its workforce and respects its employees as individuals. We do not tolerate any form of discrimination. Ciena is an Equal Opportunity Employer, including disability and protected veteran status. If contacted in relation to a job opportunity, please advise Ciena of any accommodation measures you may require.
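As an illustration of the SQL-plus-Snowflake reporting stack named above, here is a minimal sketch using the Snowflake Python connector (the connection parameters, the incidents table, and MTTR as the chosen KPI are all assumptions for illustration):

```python
import snowflake.connector  # pip install snowflake-connector-python

# Connection parameters are placeholders; real values would come from a vault.
conn = snowflake.connector.connect(
    account="my_account", user="report_bot", password="***",
    warehouse="REPORTING_WH", database="SECURITY", schema="METRICS",
)

# Hypothetical KPI: mean time to resolve incidents over the last 30 days.
cur = conn.cursor()
cur.execute("""
    SELECT AVG(DATEDIFF('hour', opened_at, resolved_at)) AS mttr_hours
    FROM incidents
    WHERE opened_at >= DATEADD('day', -30, CURRENT_TIMESTAMP())
""")
print("MTTR (hours):", cur.fetchone()[0])
```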

Posted 2 days ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Role: Senior Databricks Engineer / Databricks Technical Lead / Data Architect
Location: Bangalore, Chennai, Delhi, Pune, Kolkata

Primary Roles And Responsibilities:
- Develop Modern Data Warehouse solutions using Databricks and the AWS/Azure stack
- Provide forward-thinking solutions in the data engineering and analytics space
- Collaborate with DW/BI leads to understand new ETL pipeline development requirements
- Triage issues to find gaps in existing pipelines and fix them
- Work with the business to understand reporting-layer needs and develop data models to fulfill them
- Help junior team members resolve issues and technical challenges
- Drive technical discussions with client architects and team members
- Orchestrate the data pipelines in a scheduler via Airflow

Skills And Qualifications:
- Bachelor's and/or master's degree in computer science or equivalent experience
- Total of 6+ years of IT experience, with 3+ years' experience in data warehouse/ETL projects
- Deep understanding of star and snowflake dimensional modelling
- Strong knowledge of data management principles
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture
- Hands-on experience in SQL, Python and Spark (PySpark)
- Experience in the AWS/Azure stack
- Desirable to have ETL with batch and streaming (Kinesis)
- Experience in building ETL / data warehouse transformation processes
- Experience with Apache Kafka for use with streaming / event-based data
- Experience with other open-source big data products, including Hadoop (incl. Hive, Pig, Impala)
- Experience with open-source non-relational / NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j)
- Experience working with structured and unstructured data, including imaging & geospatial data
- Experience working in a DevOps environment with tools such as Terraform, CircleCI, Git
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning and troubleshooting
- Databricks Certified Data Engineer Associate/Professional certification (desirable)
- Comfortable working in a dynamic, fast-paced, innovative environment with several ongoing concurrent projects
- Experience working in Agile methodology
- Strong verbal and written communication skills
- Strong analytical and problem-solving skills with high attention to detail

Mandatory Skills: Python / PySpark / Spark with Azure/AWS Databricks

Skills: Neo4j, Pig, MongoDB, PL/SQL, architect, Terraform, Hadoop, PySpark, Impala, Apache Kafka, ADFS, ETL, data warehouse, Spark, Azure, Databricks, RDBMS, Cassandra, AWS, Unix shell scripting, CircleCI, Python, Azure Synapse, Hive, Git, Kinesis, SQL
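For a flavor of the Databricks/Delta Lake work described, here is a minimal PySpark sketch promoting raw "bronze" events to a cleansed "silver" Delta table (the paths and column names are placeholders; on Databricks a SparkSession is already provided as `spark`):

```python
from pyspark.sql import SparkSession, functions as F

# Build a session so the sketch is self-contained outside Databricks.
spark = SparkSession.builder.appName("bronze-to-silver").getOrCreate()

# Bronze -> Silver: deduplicate raw events and derive a date column.
bronze = spark.read.format("delta").load("/mnt/lake/bronze/events")
silver = (
    bronze.dropDuplicates(["event_id"])
          .withColumn("event_date", F.to_date("event_ts"))
)

# Persist the cleansed layer as a Delta table.
silver.write.format("delta").mode("overwrite").save("/mnt/lake/silver/events")
```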

Posted 2 days ago

Apply

3.0 years

0 Lacs

Greater Kolkata Area

On-site

Role: Senior Databricks Engineer / Databricks Technical Lead / Data Architect
Location: Bangalore, Chennai, Delhi, Pune, Kolkata

Primary Roles And Responsibilities:
- Develop Modern Data Warehouse solutions using Databricks and the AWS/Azure stack
- Provide forward-thinking solutions in the data engineering and analytics space
- Collaborate with DW/BI leads to understand new ETL pipeline development requirements
- Triage issues to find gaps in existing pipelines and fix them
- Work with the business to understand reporting-layer needs and develop data models to fulfill them
- Help junior team members resolve issues and technical challenges
- Drive technical discussions with client architects and team members
- Orchestrate the data pipelines in a scheduler via Airflow

Skills And Qualifications:
- Bachelor's and/or master's degree in computer science or equivalent experience
- Total of 6+ years of IT experience, with 3+ years' experience in data warehouse/ETL projects
- Deep understanding of star and snowflake dimensional modelling
- Strong knowledge of data management principles
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture
- Hands-on experience in SQL, Python and Spark (PySpark)
- Experience in the AWS/Azure stack
- Desirable to have ETL with batch and streaming (Kinesis)
- Experience in building ETL / data warehouse transformation processes
- Experience with Apache Kafka for use with streaming / event-based data
- Experience with other open-source big data products, including Hadoop (incl. Hive, Pig, Impala)
- Experience with open-source non-relational / NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j)
- Experience working with structured and unstructured data, including imaging & geospatial data
- Experience working in a DevOps environment with tools such as Terraform, CircleCI, Git
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning and troubleshooting
- Databricks Certified Data Engineer Associate/Professional certification (desirable)
- Comfortable working in a dynamic, fast-paced, innovative environment with several ongoing concurrent projects
- Experience working in Agile methodology
- Strong verbal and written communication skills
- Strong analytical and problem-solving skills with high attention to detail

Mandatory Skills: Python / PySpark / Spark with Azure/AWS Databricks

Skills: Neo4j, Pig, MongoDB, PL/SQL, architect, Terraform, Hadoop, PySpark, Impala, Apache Kafka, ADFS, ETL, data warehouse, Spark, Azure, Databricks, RDBMS, Cassandra, AWS, Unix shell scripting, CircleCI, Python, Azure Synapse, Hive, Git, Kinesis, SQL

Posted 2 days ago

Apply

7.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Role: GCP Database Migration Lead
Required Technical Skill Set: GCP Database Migration Lead (Non-Oracle)
Desired Experience Range: 8-10 years
Location of Requirement: Kolkata/Delhi
Notice period: Immediate

Job Description:
- 7+ years of experience in database engineering or administration
- 3+ years of experience leading cloud-based database migrations, preferably to GCP
- Deep knowledge of traditional RDBMS (MS SQL Server, MariaDB, Oracle, MySQL)
- Strong hands-on experience with GCP database offerings (Cloud SQL, Spanner, BigQuery, Firestore, etc.)
- Experience with schema conversion tools and strategies (e.g., DMS, SCT, custom ETL)
- Solid SQL expertise and experience with data profiling, transformation, and validation
- Familiarity with IaC tools like Terraform and integration with CI/CD pipelines
- Strong problem-solving and communication skills
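A common first validation step in such migrations is reconciling row counts between the legacy source and the migrated target. Here is a minimal SQLAlchemy sketch of that check (the connection strings and table names are placeholders):

```python
from sqlalchemy import create_engine, text

# Placeholder connection strings for the legacy source and the migrated target.
src = create_engine("mssql+pyodbc://user:pass@legacy-dsn")
tgt = create_engine("postgresql+psycopg2://user:pass@cloudsql-host/appdb")

def row_count(engine, table):
    """Return the row count for a table on the given engine."""
    with engine.connect() as conn:
        return conn.execute(text(f"SELECT COUNT(*) FROM {table}")).scalar()

# Compare per-table counts and flag mismatches.
for table in ("customers", "orders"):
    s, t = row_count(src, table), row_count(tgt, table)
    print(f"{table}: source={s} target={t} {'OK' if s == t else 'MISMATCH'}")
```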

Posted 2 days ago

Apply

15.0 years

0 Lacs

Greater Hyderabad Area

On-site

HCL Job Level: DGM - Data Management (Centre of Excellence)
Domain: Multi Tower
Role: Centre of Excellence (Data Management)
Role Location: Hyderabad (Noida or Chennai as secondary locations)
Positions: 1
Experience: 15+ years

Job Profile:
- Support the Global Shared Services Strategy for Multi Tower Finance (P2P, O2C, R2R and FP&A) and Procurement tracks.
- Understand all processes in detail, including their interdependence, the current technology landscape and the organization structure.
- Ensure end-to-end data lifecycle management including ingestion, transformation, storage, and consumption, while maintaining data reliability, accuracy, and availability across enterprise systems, with a strong focus on the Enterprise Data Platform (EDP) as the central data repository.
- Collaborate with cross-functional teams to understand data requirements, identify gaps, and implement scalable solutions.
- Define and enforce data quality standards, validation rules, and monitoring mechanisms, while leading the architecture and deployment of scalable, fault-tolerant, and high-performance data pipelines to ensure consistent and trustworthy data delivery.
- Partner with IT and business teams to define and implement data access controls, ensuring compliance with data privacy and security regulations (e.g., GDPR, HIPAA).
- Understand governance and interaction models with client SMEs and drive discussions on project deliverables.
- Collaborate with business stakeholders to define data SLAs (Service Level Agreements) and ensure adherence through proactive monitoring and alerting.
- Act as a bridge between business and IT, translating business needs into technical solutions and ensuring alignment with strategic goals.
- Establish and maintain metadata management practices, including data lineage, cataloging, and business glossary development.
- Propose feasible solutions, both interim and long term, to resolve problem statements and address key priorities; solutioning must be at a strategic level and at L2/L3 level.
- Drive alignment of processes, people, technology and best practices, enabling optimization, breaking silos, eliminating redundant methods, and standardizing processes and controls across the entire engagement for data management.
- Identify process variations across regions and businesses and evaluate standardization opportunities by defining golden processes for data collection and data management.

Required Profile / Experience:
- Deep understanding of all Finance towers and Procurement.
- Strong understanding of data management principles, data architecture, and data governance.
- Understanding of, and hands-on experience with, data integration tools, ETL/ELT processes, and cloud-based data platforms.
- Proven track record in managing tool integrations and ensuring accurate, high-performance data flow, with strong expertise in data quality frameworks, monitoring tools and performance optimization techniques, and a solid foundation in data modeling, metadata management, and master data management (MDM) concepts.
- Leadership capability: relevant leadership experience in running large delivery operations and driving multiple enterprise-level initiatives and programs with high business impact.
- BPO experience: relevant experience in BPO services, especially in the Americas, is desired.
- Transformation: should have led and delivered at least 2-3 data transformation projects covering application integrations and master data management.
- Tools and industry benchmarks: knowledge of industry-wide trends in F&A tools, platforms and benchmarks (Azure Data Lake, AWS, GCP).
- Customer-facing skills: proficient in leading meetings and presentations with customers using powerful product-level material.

Education Requirement:
- B.E./B.Tech/MCA or equivalent in Computer Science, Information Systems, or a related field.
- Certifications in data management tools or platforms (e.g., Informatica, Talend, Azure Data Engineer, etc.) are preferred.

Posted 2 days ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About Us

Location - Hyderabad, India
Department - Product R&D
Level - Professional
Working Pattern - Work from office
Benefits - Benefits At Ideagen
DEI - DEI strategy
Salary - this will be discussed at the next stage of the process; if you have any questions, please feel free to reach out!

We are seeking an experienced Data Engineer with strong problem-solving and analytical skills, high attention to detail, a passion for analytics, real-time data and monitoring, and strong critical-thinking and collaboration skills. The candidate should be a self-starter and a quick learner, ready to pick up whatever new technologies and tools the job demands.

Responsibilities

  • Build automated pipelines and solutions for data migration, data import, and other operations requiring data ETL (a minimal illustrative sketch follows this posting).
  • Perform analysis on core products to support migration planning and development.
  • Work closely with the Team Lead and collaborate with other stakeholders to gather requirements and build well-architected data solutions.
  • Produce supporting documentation, such as specifications, data models, and descriptions of the relations between data, required for the effective development, usage, and communication of data operations solutions with different stakeholders.

Mandatory Skills

  • Minimum 3 years of experience with SnapLogic pipeline development, including a minimum of 2 years building ETL/ELT pipelines.
  • Experience working with on-premises and/or cloud-based databases such as MSSQL, MySQL, PostgreSQL, Azure SQL, Aurora MySQL & PostgreSQL, and AWS RDS.
  • Experience working with API sources and destinations.

Essential Skills and Experience

  • Strong knowledge of databases, data modeling, and the data life cycle.
  • Proficient in understanding data and writing complex SQL.
  • Experience working with REST APIs in data pipelines.
  • Strong problem solving and high attention to detail.
  • Passion for analytics, real-time data, and monitoring.
  • Critical thinking, good communication, and collaboration skills.
  • Focus on high performance and quality delivery.
  • Highly self-motivated continuous learner.

Desirable

  • Experience working with NoSQL databases like MongoDB.
  • Experience with SnapLogic administration.
  • Experience working with Microsoft Power Platform (Power Automate and Power Apps) or a similar automation/RPA tool.
  • Experience with cloud data platforms such as Snowflake, Databricks, AWS, and Azure.
  • Awareness of emerging ETL and cloud concepts, such as Amazon AWS or Microsoft Azure.
  • Experience working with scripting languages such as Python, R, and JavaScript.

About Ideagen

Ideagen is the invisible force behind many things we rely on every day - from keeping airplanes soaring in the sky, to ensuring the food on our tables is safe, to helping doctors and nurses care for the sick. So, when you think of Ideagen, think of it as the silent teammate that's always working behind the scenes to help the people who make our lives safer and better. Every day, millions of people are kept safe using Ideagen software. We have offices all over the world, including America, Australia, Malaysia and India, with people doing many different and exciting jobs.

What is next?

If your application meets the requirements for this role, our Talent Acquisition team will be in touch to guide you through the next steps. To ensure a flexible and inclusive process, please let us know if you require any reasonable adjustments by contacting us at recruitment@ideagen.com. All matters will be treated with strict confidence. At Ideagen, we value the importance of work-life balance and welcome candidates seeking flexible or part-time working arrangements. If this is something you are interested in, please let us know during the application process. Enhance your career and make the world a safer place!
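
The role above is built around pipeline work: pulling data from API sources, shaping it, and landing it in a relational database. As a rough illustration only, here is a minimal Python sketch of that extract-transform-load pattern. The endpoint, connection string, table, and column names are invented placeholders, and SnapLogic itself (the tool the posting asks for) replaces this kind of hand-written glue with configurable pipelines.

    import requests
    import psycopg2

    # Placeholder endpoint and connection string -- assumptions for the
    # sketch, not systems named in the posting.
    API_URL = "https://api.example.com/v1/orders"
    PG_DSN = "dbname=warehouse user=etl_user host=localhost"

    def extract(url):
        """Pull raw JSON records from a REST API source."""
        response = requests.get(url, timeout=30)
        response.raise_for_status()
        return response.json()

    def transform(records):
        """Apply a basic data-quality gate and shape rows for loading."""
        rows = []
        for r in records:
            if r.get("order_id") is None:  # drop rows missing the key
                continue
            rows.append((r["order_id"], r.get("customer_id"), r.get("amount", 0.0)))
        return rows

    def load(rows):
        """Upsert the shaped rows into the target table
        (assumes a unique constraint on order_id)."""
        with psycopg2.connect(PG_DSN) as conn, conn.cursor() as cur:
            cur.executemany(
                """
                INSERT INTO orders (order_id, customer_id, amount)
                VALUES (%s, %s, %s)
                ON CONFLICT (order_id) DO UPDATE
                SET customer_id = EXCLUDED.customer_id,
                    amount = EXCLUDED.amount
                """,
                rows,
            )

    if __name__ == "__main__":
        load(transform(extract(API_URL)))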

Posted 2 days ago

Apply

10 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Title: Technical Project Manager
Location: Hyderabad
Employment Type: Full-time
Experience: 10+ years
Domain: Banking and Insurance

We are seeking a Technical Project Manager to lead and coordinate the delivery of data-centric projects. This role bridges the gap between engineering teams and business stakeholders, ensuring the successful execution of technical initiatives, particularly in data infrastructure, pipelines, analytics, and platform integration.

Responsibilities:

  • Lead end-to-end project management for data-driven initiatives, including planning, execution, delivery, and stakeholder communication.
  • Work closely with data engineers, analysts, and software developers to ensure technical accuracy and timely delivery of projects.
  • Translate business requirements into technical specifications and work plans.
  • Manage project timelines, risks, resources, and dependencies using Agile, Scrum, or Kanban methodologies (see the sketch after this posting).
  • Drive the development and maintenance of scalable ETL pipelines, data models, and data integration workflows.
  • Oversee code reviews and ensure adherence to data engineering best practices.
  • Provide hands-on support, when necessary, in Python-based development or debugging.
  • Collaborate with cross-functional teams including Product, Data Science, DevOps, and QA.
  • Track project metrics and prepare progress reports for stakeholders.

Required Qualifications:

  • Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field.
  • 10+ years of experience in project management or technical leadership roles.
  • Strong understanding of modern data architectures (e.g., data lakes, warehousing, streaming).
  • Experience working with cloud platforms such as AWS, GCP, or Azure.
  • Familiarity with tools such as JIRA, Confluence, Git, and CI/CD pipelines.
  • Strong communication and stakeholder management skills.

Benefits

Company standard benefits.
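
Much of this role is sequencing work across a pipeline: a task can only start once everything it depends on has finished. As a toy illustration (all task names invented), Python's standard-library graphlib module expresses exactly that kind of dependency graph; the orchestration tools used on real data platforms automate the same idea at scale.

    from graphlib import TopologicalSorter

    # Each task maps to the set of tasks it depends on. Names are
    # hypothetical examples, not a real project plan.
    PIPELINE = {
        "extract_orders": set(),
        "extract_customers": set(),
        "transform_join": {"extract_orders", "extract_customers"},
        "load_warehouse": {"transform_join"},
        "refresh_dashboard": {"load_warehouse"},
    }

    def run_task(name):
        """Stand-in for real work (an extract, a transform, a load)."""
        print(f"running {name}")

    # static_order() yields tasks in an order that respects every
    # dependency, so downstream steps never start early.
    for task in TopologicalSorter(PIPELINE).static_order():
        run_task(task)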

Posted 2 days ago

Apply

Exploring ETL Jobs in India

The ETL (Extract, Transform, Load) job market in India is thriving with numerous opportunities for job seekers. ETL professionals play a crucial role in managing and analyzing data effectively for organizations across various industries. If you are considering a career in ETL, this article will provide you with valuable insights into the job market in India.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Pune
  4. Hyderabad
  5. Chennai

These cities are known for their thriving tech industries and often have a high demand for ETL professionals.

Average Salary Range

The average salary range for ETL professionals in India varies based on experience levels. Entry-level positions typically start at around ₹3-5 lakhs per annum, while experienced professionals can earn upwards of ₹10-15 lakhs per annum.

Career Path

In the ETL field, a typical career path may include roles such as:

  • Junior ETL Developer
  • ETL Developer
  • Senior ETL Developer
  • ETL Tech Lead
  • ETL Architect

As you gain experience and expertise, you can progress to higher-level roles within the ETL domain.

Related Skills

Alongside ETL, professionals in this field are often expected to have skills in:

  • SQL
  • Data Warehousing
  • Data Modeling
  • ETL Tools (e.g., Informatica, Talend)
  • Database Management Systems (e.g., Oracle, SQL Server)

Having a strong foundation in these related skills can enhance your capabilities as an ETL professional.

Interview Questions

Here are 25 interview questions that you may encounter in ETL job interviews:

  • What is ETL and why is it important? (basic)
  • Explain the difference between ETL and ELT processes. (medium)
  • How do you handle incremental loads in ETL processes? (medium; a worked sketch follows this list)
  • What is a surrogate key in the context of ETL? (basic)
  • Can you explain the concept of data profiling in ETL? (medium)
  • How do you handle data quality issues in ETL processes? (medium)
  • What are some common ETL tools you have worked with? (basic)
  • Explain the difference between a full load and an incremental load. (basic)
  • How do you optimize ETL processes for performance? (medium)
  • Can you describe a challenging ETL project you worked on and how you overcame obstacles? (advanced)
  • What is the significance of data cleansing in ETL? (basic)
  • How do you ensure data security and compliance in ETL processes? (medium)
  • Have you worked with real-time data integration in ETL? If so, how did you approach it? (advanced)
  • What are the key components of an ETL architecture? (basic)
  • How do you handle data transformation requirements in ETL processes? (medium)
  • What are some best practices for ETL development? (medium)
  • Can you explain the concept of change data capture in ETL? (medium)
  • How do you troubleshoot ETL job failures? (medium)
  • What role does metadata play in ETL processes? (basic)
  • How do you handle complex transformations in ETL processes? (medium)
  • What is the importance of data lineage in ETL? (basic)
  • Have you worked with parallel processing in ETL? If so, explain your experience. (advanced)
  • How do you ensure data consistency across different ETL jobs? (medium)
  • Can you explain the concept of slowly changing dimensions in ETL? (medium; a worked sketch follows this list)
  • How do you document ETL processes for knowledge sharing and future reference? (basic)
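
Two of the questions above come up often enough that a concrete sketch helps. First, incremental loads: instead of reloading an entire table, each run copies only the rows changed since a stored high-water mark. The sketch below assumes hypothetical src_orders, dw_orders, and etl_watermark tables; all names are illustrative, not from any particular tool.

    import psycopg2

    PG_DSN = "dbname=warehouse user=etl_user host=localhost"  # placeholder

    def incremental_load():
        """Watermark-based incremental load: copy only the rows
        changed since the last successful run."""
        with psycopg2.connect(PG_DSN) as conn, conn.cursor() as cur:
            # 1. Read the high-water mark left by the previous run.
            cur.execute(
                "SELECT last_loaded_at FROM etl_watermark WHERE source = %s",
                ("orders",),
            )
            (watermark,) = cur.fetchone()

            # 2. Upsert only rows modified after the watermark.
            cur.execute(
                """
                INSERT INTO dw_orders (order_id, amount, last_updated)
                SELECT order_id, amount, last_updated
                FROM src_orders
                WHERE last_updated > %s
                ON CONFLICT (order_id) DO UPDATE
                SET amount = EXCLUDED.amount,
                    last_updated = EXCLUDED.last_updated
                """,
                (watermark,),
            )

            # 3. Advance the watermark to the newest row actually loaded,
            #    so the next run resumes exactly where this one ended.
            cur.execute(
                """
                UPDATE etl_watermark
                SET last_loaded_at = (SELECT MAX(last_updated) FROM dw_orders)
                WHERE source = %s
                """,
                ("orders",),
            )

    incremental_load()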

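Second, slowly changing dimensions. A Type 2 dimension keeps history: when a tracked attribute changes, the current row is closed out and a new current version is inserted. This is also why dimensions use generated surrogate keys rather than business keys, since one customer_id can map to many historical versions. The dim_customer and staging_customer tables below are invented for illustration.

    import psycopg2

    PG_DSN = "dbname=warehouse user=etl_user host=localhost"  # placeholder

    # Close the current version of any row whose tracked attributes changed.
    CLOSE_CHANGED = """
    UPDATE dim_customer d
    SET    valid_to = now(), is_current = FALSE
    FROM   staging_customer s
    WHERE  d.customer_id = s.customer_id
      AND  d.is_current
      AND  (d.city IS DISTINCT FROM s.city
            OR d.segment IS DISTINCT FROM s.segment)
    """

    # Insert a fresh current version for new customers and for the rows
    # just closed; a surrogate key column would be auto-generated here.
    INSERT_CURRENT = """
    INSERT INTO dim_customer (customer_id, city, segment,
                              valid_from, valid_to, is_current)
    SELECT s.customer_id, s.city, s.segment, now(), NULL, TRUE
    FROM   staging_customer s
    LEFT JOIN dim_customer d
           ON d.customer_id = s.customer_id AND d.is_current
    WHERE  d.customer_id IS NULL
    """

    with psycopg2.connect(PG_DSN) as conn, conn.cursor() as cur:
        cur.execute(CLOSE_CHANGED)
        cur.execute(INSERT_CURRENT)
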
Closing Remarks

As you explore ETL jobs in India, remember to showcase your skills and expertise confidently during interviews. With the right preparation and a solid understanding of ETL concepts, you can embark on a rewarding career in this dynamic field. Good luck with your job search!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click


Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies