7.0 years
0 Lacs
Hyderabad, Telangana, India
Remote
Job Summary: We are seeking a highly skilled Lead Data Engineer / Associate Architect to lead the design, implementation, and optimization of scalable data architectures. The ideal candidate will have a deep understanding of data modeling, ETL processes, cloud data solutions, and big data technologies. You will work closely with cross-functional teams to build robust, high-performance data pipelines and infrastructure that enable data-driven decision-making.

Experience: 7 - 12 years
Work Location: Hyderabad (Hybrid) / Remote
Mandatory skills: AWS, Python, SQL, Airflow, DBT
Must have completed one or two projects in the clinical domain/clinical industry.

Responsibilities:
- Architecture: Design and develop scalable, resilient data architectures that support business needs, analytics, and AI/ML workloads.
- Data Pipeline Development: Design and implement robust ETL/ELT processes to ensure efficient data ingestion, transformation, and storage.
- Big Data & Cloud Solutions: Architect data solutions on cloud platforms such as AWS, Azure, or GCP, leveraging services such as Snowflake, Redshift, BigQuery, and Databricks.
- Database Optimization: Ensure performance tuning, indexing strategies, and query optimization for relational and NoSQL databases.
- Data Governance & Security: Implement best practices for data quality, metadata management, compliance (GDPR, CCPA), and security.
- Collaboration & Leadership: Work closely with data engineers, analysts, and business stakeholders to translate business requirements into scalable solutions.
- Technology Evaluation: Stay current with emerging trends, assess new tools and frameworks, and drive innovation in data engineering.

Required Skills:
- Education: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- Experience: 7 - 12+ years of experience in data engineering.
- Cloud Platforms: Strong expertise in AWS data services.
- Databases: Hands-on experience with SQL, NoSQL, and columnar databases such as PostgreSQL, MongoDB, Cassandra, and Snowflake.
- Programming: Proficiency in Python, Scala, or Java for data processing and automation.
- ETL Tools: Experience with tools such as Apache Airflow, Talend, DBT, or Informatica.
- Machine Learning & AI Integration (preferred): Understanding of how to architect data solutions for AI/ML applications.
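For candidates sizing up the Airflow-plus-DBT stack this role names, here is a minimal orchestration sketch. It is illustrative only: the DAG id, the project path, and the extract callable are invented, and it assumes Airflow 2.x with the dbt CLI installed on the worker.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator

def extract_clinical_events(**context):
    # Placeholder extract step: a real pipeline would pull from an upstream
    # source (e.g., an S3 landing bucket) for the execution date.
    print(f"Extracting events for {context['ds']}")

with DAG(
    dag_id="clinical_events_elt",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract = PythonOperator(
        task_id="extract",
        python_callable=extract_clinical_events,
    )
    # DBT handles the transform layer; shelling out to the CLI from Airflow
    # is one common integration pattern (dbt Cloud or Cosmos are alternatives).
    transform = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/clinical --select staging+",
    )
    extract >> transform
```

The split shown here, ingestion in Airflow tasks and transformation delegated to DBT models, is the usual division of labor when both tools appear in one stack.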
Posted 3 days ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Our client is a trusted global innovator of IT and business services, helping clients transform through consulting, industry solutions, business process services, digital and IT modernization, and managed services. Our client enables them, as well as society, to move confidently into the digital future. We are committed to our clients' long-term success and combine global reach with local client attention to serve them in over 50 countries around the globe.

Job Title: SAP BODS (Data Migration)
Location: Hyderabad
Experience: 5+ years
Job Type: Contract to hire
Notice Period: Immediate joiner

Mandatory Skills:
- 2-4 years of overall technical experience in SAP BODS across all application modules (Extract, Transform, Load)
- 1-2 years of data migration experience with S/4 HANA/ECC implementations
- Experience with BODS Designer components: Projects, Jobs, Workflows, Data Flows, Scripts, Data Stores, and Formats
- Experience in BODS performance tuning techniques using parallel processing (Degree of Parallelism), multithreading, partitioning, and database throughputs to improve job performance
- Experience in ETL using SAP BODS and SAP IS with respect to SAP master/transaction data objects in SAP FICO, SAP SD, SAP MM/WM, and SAP Plant Maintenance
Posted 3 days ago
6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
🧾 Job Title: Application Developer – Data Engineering
🕒 Experience: 4–6 Years
📅 Notice Period: Immediate to 20 Days

🔍 Job Summary:
We are looking for a highly skilled Data Engineering Application Developer to join our dynamic team. You will be responsible for the design, development, and configuration of data-driven applications that align with key business processes. Your role will also include refining data workflows, optimizing performance, and supporting business goals through scalable and reliable data solutions.

📌 Roles & Responsibilities:
- Independently develop and maintain data pipelines and ETL processes.
- Become a Subject Matter Expert (SME) in data engineering tools and practices.
- Collaborate with cross-functional teams to gather requirements and provide data-driven solutions.
- Actively participate in team discussions and contribute to problem-solving efforts.
- Create and maintain comprehensive technical documentation, including application specifications and user guides.
- Stay updated with industry best practices and continuously improve application and data-processing performance.

🛠️ Professional & Technical Skills:
✅ Must-Have Skills:
- Proficiency in Data Engineering, PySpark, and Python
- Strong knowledge of ETL processes and data modeling
- Experience working with cloud platforms such as AWS or Azure
- Hands-on expertise with SQL or NoSQL databases
- Familiarity with other programming languages such as Java

➕ Good-to-Have Skills:
- Knowledge of big data tools and frameworks (e.g., Hadoop, Hive, Kafka)
- Experience with CI/CD tools and DevOps practices
- Exposure to containerization tools like Docker or Kubernetes
Posted 3 days ago
4.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Notice Period: Immediate joiner to 30 days
Location: Mumbai/Pune/Bangalore/Delhi/Chennai
Experience: 4 to 8 years

Roles & Responsibilities:
- Must have 4+ years of current experience and must understand BPC Standard & Embedded: Environments, Dimensions, Models, Input Forms, Reports, Business Rules, Data Security, Work Status, Data Manager Packages, Planning Functions & Sequences.
- Business understanding of standalone and consolidated Ind AS and IFRS reporting, intercompany eliminations, ownership, and NCI.
- Must have completed at least 2 implementations of BPC Consolidation.
- Should have worked on application design and all documentation through go-live.
- Must have handled a team of 2-3 people on projects.
- Demonstrated ability to work independently as well as in a collaborative team environment.
- Experience with SAP BW/Group Reporting preferred.
- Experience with data ETL, transformation, and integration with BPC using SAP BW/ABAP/HANA.
- Creation of Logic Scripts, BADIs, and custom logic.
- Experience with tuning of the BPC landscape desirable.
- Experience with EPM, Analysis for Office, and SAC reporting.
- Proficient in ITGC- and ITAC-related audit processes and in ensuring adherence to technical control protocols.
- Change management expertise: demonstrated capability to manage change effectively in large-scale, high-pressure environments, ensuring stakeholder buy-in for new processes.

Eligibility criteria and requirements:
- Good communication skills, both written and oral.
- Strong education background (CA).
- Must be a team player (raises issues/concerns and seeks meaningful resolution).
- Strong interpersonal skills.
- Well-developed business acumen.
- Strong problem-solving skills.
- Comfortable with travel and flexible to manage time-sensitive deliverables.
- Strong sense of ownership and accountability for tasks, with attention to detail.
Posted 3 days ago
2.0 - 5.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
We are looking for an experienced Data Engineer with a background in building large-scale data pipelines and data lake ecosystems. Our daily work revolves around solving interesting and exciting problems to high engineering standards. Although you will be part of the backend team, you will work with cross-functional teams across the organization. This role demands strong hands-on skills in several programming languages, especially Python, and knowledge of technologies like Kafka, AWS Glue, CloudFormation, and ECS. You will spend most of your time facilitating the seamless streaming, tracking, and collation of huge data sets. This is a backend role, but not limited to it: you will work closely with producers and consumers of the data and build optimal solutions for the organization. We value patience and a deep understanding of data. Also, we believe in extreme ownership!

Responsibilities:
- Design and build systems to efficiently move data across multiple systems and make it available for teams such as Data Science, Data Analytics, and Product.
- Design, construct, test, and maintain data management systems.
- Understand the data and business metrics required by the product, and architect systems to make that data available in a usable/queryable manner.
- Ensure that all systems meet business/company requirements as well as industry best practices.
- Keep abreast of new technologies in our domain.
- Recommend ways to constantly improve data reliability and quality.

Requirements:
- Bachelor's/Master's degree, preferably in Computer Science or a related technical field.
- 2-5 years of relevant experience.
- Deep knowledge of and working experience with the Kafka ecosystem.
- Good programming experience, preferably in Python, Java, or Go, and a willingness to learn more.
- Experience working with large data platforms.
- Strong knowledge of microservices, data warehouse, and data lake systems in the cloud, especially AWS Redshift, S3, and Glue.
- Strong hands-on experience in writing complex and efficient ETL jobs.
- Experience with version management systems (preferably Git).
- Strong analytical thinking and communication.
- Passion for finding and sharing best practices and driving discipline for superior data quality and integrity.
- Intellectual curiosity to find new and unusual ways to solve data management issues.
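Given this posting's emphasis on the Kafka ecosystem feeding AWS storage, a minimal consumer sketch follows. The topic, broker address, and group id are made-up placeholders, and the kafka-python client library is assumed.

```python
import json

from kafka import KafkaConsumer  # kafka-python client

# Topic and broker addresses are hypothetical placeholders.
consumer = KafkaConsumer(
    "user-events",
    bootstrap_servers=["localhost:9092"],
    group_id="analytics-loader",
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # In a real pipeline these records would be buffered into batches and
    # written to S3, then picked up by an AWS Glue job for transformation.
    print(message.topic, message.partition, message.offset, event.get("type"))
```

The consumer-group setting is what lets several workers share one topic, which is typically how pipelines like the ones described above scale out.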
Posted 3 days ago
0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
We are seeking a highly experienced AWS Data Solution Architect to lead the design and implementation of scalable, secure, and high-performance data architectures on the AWS cloud. The ideal candidate will have a deep understanding of cloud-based data platforms, analytics, and best practices for optimizing data pipelines and storage. You will work closely with data engineers, business stakeholders, and cloud architects to deliver robust data solutions.

Key Responsibilities:
1. Architecture Design and Planning: Design scalable and resilient data architectures on AWS that include data lakes, data warehouses, and real-time processing. Architect end-to-end data solutions leveraging AWS services such as S3, Redshift, RDS, DynamoDB, Glue, and Lake Formation. Develop multi-layered security frameworks for data protection and governance.
2. Data Pipeline Development: Build and optimize ETL/ELT pipelines using AWS Glue, Data Pipeline, and Lambda. Integrate data from sources such as RDBMSs, NoSQL stores, APIs, and streaming platforms. Ensure high availability and real-time processing capabilities for mission-critical applications.
3. Data Warehousing and Analytics: Design and optimize data warehouses using Amazon Redshift or Snowflake. Implement data modeling, partitioning, and indexing for optimal performance. Create analytical models to drive business insights and data-driven decision-making.
4. Real-time Data Processing: Implement real-time data processing using AWS Kinesis, Kafka, or MSK. Architect solutions for event-driven architectures with Lambda and EventBridge.
5. Security and Compliance: Implement best practices for data security, encryption, and access control using IAM, KMS, and Lake Formation. Ensure compliance with regulatory standards such as GDPR, HIPAA, and CCPA.
6. Monitoring and Optimization: Monitor performance, optimize costs, and enhance the reliability of data pipelines and storage. Set up observability with AWS CloudWatch, X-Ray, and CloudTrail. Troubleshoot issues and ensure business continuity with automated recovery mechanisms.
7. Documentation and Best Practices: Create detailed architecture diagrams, data flow mappings, and reference documentation. Establish best practices for data governance, architecture design, and deployment.
8. Collaboration and Leadership: Work closely with data engineers, application developers, and DevOps teams to ensure seamless integration. Act as a technical advisor to business stakeholders for cloud-based data solutions.

Regulatory Compliance Reporting Experience:
The architect should be able to resolve complex challenges arising from the strict regulatory environment in India and the need to balance compliance with operational efficiency. Key complexities include:
a) Building data segregation and access control capability: this requires an in-depth understanding of data privacy laws, Amazon's global data architecture, and the ability to design systems that segregate and control access to sensitive payment data without compromising functionality.
b) Integrating diverse data sources into a Secure Redshift Cluster (SRC), which involves working with multiple teams and systems, each with its own data structures and transfer protocols.
c) Instrumenting additional UPI data elements, which requires collaborating with UPI tech teams and a deep understanding of UPI transaction flows to ensure accurate and compliant data capture.
d) Automating Law Enforcement Agency (LEA) and Financial Intelligence Unit (FIU) reporting: this involves creating secure, automated pipelines for highly sensitive data, ensuring accuracy and timeliness while meeting strict regulatory requirements.

The architect will also be extending India-specific solutions to serve worldwide markets. Complexities include:
a) Designing a unified data storage and compute architecture, which requires harmonizing diverse tech stacks and data logging practices across multiple countries while considering data sovereignty laws and the cost implications of cross-border data transfers.
b) Setting up comprehensive datamarts covering metrics and dimensions, which involves standardizing metric definitions across markets, ensuring data consistency, and designing for scalability to accommodate future growth.
c) Enabling customer segmentation across power-up programs, which requires integrating data from diverse programs while maintaining data integrity and respecting country-specific data usage regulations.
d) Managing time zone challenges: synchronizing data across multiple time zones requires innovative solutions to ensure timely data availability without compromising completeness or accuracy.
e) Navigating regulatory complexities: designing systems that comply with varying and evolving data regulations across multiple countries while maintaining operational efficiency and flexibility for future changes.
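To make the real-time processing responsibility above concrete, here is a minimal AWS Lambda handler for a Kinesis stream trigger. The record layout follows the standard Kinesis event payload; the stream wiring, field names, and downstream step are illustrative assumptions, not details from the posting.

```python
import base64
import json

def handler(event, context):
    """Minimal Lambda handler for a Kinesis trigger (sketch only).

    Field names like 'transaction_id' are hypothetical; a real handler
    would validate, enrich, and forward records (e.g., to Firehose or a
    Redshift staging table) instead of printing them.
    """
    for record in event["Records"]:
        payload = base64.b64decode(record["kinesis"]["data"])
        transaction = json.loads(payload)
        print(transaction.get("transaction_id"), transaction.get("amount"))
    return {"processed": len(event["Records"])}
```

Pairing a handler like this with EventBridge rules for routing is one common shape of the event-driven architectures the posting mentions.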
Posted 3 days ago
5.0 years
0 Lacs
Gurugram, Haryana, India
On-site
ABOUT THE JOB

You will be responsible for:
- Leading and mentoring data analysts, and collaborating with stakeholders on all sorts of P&L- and product-related ad-hoc analysis and investigations, including hypothesis generation and validation for any movement in P&L metrics.
- Driving end-to-end analytics and insights to help the team take data-driven decisions.
- Collaborating across functions such as product, marketing, design, growth, strategy, customer relations, and technology.

What are we looking for?
- Must have: SQL coding skills and advanced SQL knowledge
- Must have: a data visualization/ETL tool – Tableau, Power BI, SSIS, or BODS
- Must have: expertise in MS Excel – VLOOKUP, HLOOKUP, pivots, Solver, and Data Analysis
- Must have: experience in statistics and strong analytical capabilities
- Good to have: Python (pandas, NumPy) or R
- Good to have: machine learning knowledge and predictive modelling
- Good to have: AWS

Qualifications and Skills:
- Bachelor's or Master's in Technology or a related field
- Minimum 5 years of experience in an analytics role
Posted 3 days ago
5.0 - 8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Company: IT Services Organization

Key Skills: Snowflake development, DBT (CLI & Cloud), ELT pipeline design, SQL scripting, data modeling, GitHub CI/CD integration, Snowpipe, performance tuning, data governance, troubleshooting.

Roles and Responsibilities:
- Design, develop, and maintain scalable data pipelines and ELT workflows using Snowflake SQL and DBT.
- Utilize the SnowSQL CLI and Snowpipe for real-time and batch data loading, including the creation of custom functions and stored procedures.
- Implement Snowflake task orchestration and schema modeling, and perform system performance tuning for large-scale data environments.
- Build, deploy, and manage robust data models within Snowflake to support reporting and analytical solutions.
- Leverage DBT (CLI and Cloud) to script and manage complex ELT logic, applying best practices for version control using GitHub.
- Independently design and execute innovative ETL and reporting solutions that align with business and operational goals.
- Conduct issue triaging, pipeline debugging, and optimization to address data quality and processing gaps.
- Ensure technical designs adhere to data governance policies, security standards, and non-functional requirements (e.g., reliability, scalability, performance).
- Provide expert guidance on Snowflake features, optimization, security best practices, and cross-environment data movement strategies.
- Create and maintain comprehensive documentation for database objects, ETL processes, and data workflows.
- Collaborate with DevOps teams to implement CI/CD pipelines involving GitHub, DBT, and Snowflake integrations.
- Troubleshoot post-deployment production issues and deliver timely resolutions.

Experience Requirements:
- 5-8 years of experience in data engineering, with a strong focus on Snowflake and modern data architecture.
- Hands-on experience with Snowflake's architecture, including SnowSQL, Snowpipe, stored procedures, schema design, and workload optimization.
- Extensive experience with DBT (CLI and Cloud), including scripting, transformation logic, and integration with GitHub for version control.
- Has successfully built and deployed large-scale ELT pipelines using Snowflake and DBT, optimizing for performance and data quality.
- Proven track record of troubleshooting complex production data issues and resolving them with minimal downtime.
- Experience aligning data engineering practices with data governance and compliance standards.
- Familiarity with CI/CD pipelines in a cloud data environment, including deploying updates to production using GitHub Actions and DBT integrations.
- Strong ability to communicate technical details clearly across teams and stakeholders.

Education: Any Post Graduation, Any Graduation.
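For a feel of the Snowflake loading workflow this role centers on, here is a minimal sketch using the snowflake-connector-python client. The account, stage, and table names are invented, and real credentials would come from a secrets manager; the COPY INTO statement shown is the same pattern Snowpipe automates on file arrival.

```python
import snowflake.connector  # snowflake-connector-python

# All connection parameters below are placeholders, never hard-code secrets.
conn = snowflake.connector.connect(
    account="xy12345.ap-south-1",
    user="ETL_SVC",
    password="***",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="RAW",
)

try:
    cur = conn.cursor()
    # Bulk-load staged files into a raw table (hypothetical stage and table).
    cur.execute(
        "COPY INTO raw.orders FROM @raw.orders_stage "
        "FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)"
    )
    print(cur.fetchall())  # per-file load results returned by COPY
finally:
    conn.close()
```

Downstream of a load like this, DBT models would handle the transform layer, which is the division of labor the responsibilities list describes.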
Posted 3 days ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Join our digital revolution in NatWest Digital X.

In everything we do, we work to one aim: to make digital experiences which are effortless and secure. So we organise ourselves around three principles: engineer, protect, and operate. We engineer simple solutions, we protect our customers, and we operate smarter.

Our people work differently depending on their jobs and needs. From hybrid working to flexible hours, we have plenty of options that help our people to thrive. This role is based in India, and as such all normal working days must be carried out in India.

Job Description
Join us as a Client Analytics Associate:
- Take on a new challenge in Data & Analytics and help us shape the future of our business
- You'll be helping to manage the analysis of complex data to identify business issues and opportunities, and supporting the delivery of high-quality business solutions
- We're committed to mapping a career path that works for you, with a focus on helping you build new skills and engage with the latest ideas and technologies in data analytics
- We're offering this role at associate level

What you'll do
As a Data & Analytics Analyst, you'll be planning and providing high-quality analytical input to support the development and implementation of innovative processes and problem resolution. You'll be capturing, validating, and documenting business and data requirements, making sure they are in line with key strategic principles. We'll look to you to interrogate, interpret, and visualise large volumes of data to identify, support, and challenge business opportunities and identify solutions.

You'll also be:
- Performing data extraction, storage, manipulation, processing, and analysis
- Conducting and supporting options analysis, identifying the most appropriate solution
- Helping to maintain full traceability and linkage of business requirements to analytics outputs
- Seeking opportunities to challenge and improve current business processes, ensuring the best result for the customer
- Creating and executing quality assurance at various stages of the project to validate the analysis, ensure data quality, and identify and resolve data inconsistencies

The skills you'll need
You'll need a background in business analysis tools and techniques, along with the ability to influence through communications tailored to a specific audience. Additionally, you'll need the ability to use core technical skills. You'll also demonstrate:
- Strong analytic and problem-solving abilities
- A keen eye for detail in your work
- Strong proficiency in T-SQL (writing complex queries, stored procedures, views, and functions) using SQL Server
- Experience with SSIS (SQL Server Integration Services), building and maintaining ETL pipelines
- Experience in designing and developing interactive Tableau dashboards and reports, with the ability to translate business requirements into effective visualisations
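As a taste of the T-SQL-on-SQL-Server work this role leans on, here is a small sketch run from Python via pyodbc. The server, database, and table names are invented for illustration, and the driver string assumes a standard ODBC Driver 18 installation.

```python
import pyodbc

# DSN-less connection string; server and database names are placeholders.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=analytics-db.example.com;DATABASE=ClientAnalytics;"
    "Trusted_Connection=yes;"
)

# A typical analyst aggregation over hypothetical client tables.
sql = """
SELECT c.segment,
       COUNT(*)       AS clients,
       AVG(b.balance) AS avg_balance
FROM   dbo.Clients  AS c
JOIN   dbo.Balances AS b ON b.client_id = c.client_id
GROUP  BY c.segment
ORDER  BY avg_balance DESC;
"""

for row in conn.cursor().execute(sql):
    print(row.segment, row.clients, row.avg_balance)
```

In practice a query like this would be wrapped in a view or stored procedure and surfaced through a Tableau dashboard, which matches the skills list above.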
Posted 3 days ago
5.0 - 10.0 years
15 - 30 Lacs
Bengaluru
Remote
Hiring for a USA-based big multinational company (MNC).

We are seeking a highly skilled Automation Test Engineer to design, develop, and execute automated test scripts to ensure the quality and reliability of our software applications. The ideal candidate will have hands-on experience with automation frameworks, scripting languages, and a strong understanding of software QA methodologies. This role involves close collaboration with developers, QA analysts, and product teams to build robust testing solutions and deliver high-quality software.

Responsibilities:
- Design and implement automated test scripts and frameworks for web, mobile, or API testing.
- Develop reusable and maintainable test automation code using tools such as Selenium, Appium, Cypress, Postman, TestNG, JUnit, etc.
- Integrate automated tests into CI/CD pipelines using tools like Jenkins, GitLab CI, or Azure DevOps.
- Work closely with developers and QA teams to identify test scenarios and improve test coverage.
- Execute and monitor automated test suites, analyze results, and report defects clearly.
- Maintain existing automated tests and troubleshoot failures effectively.
- Participate in code reviews and contribute to the continuous improvement of testing practices.
- Collaborate with stakeholders to ensure business requirements are clearly understood and testable.

Requirements:
- Proven experience in software test automation and scripting
- Proficiency in at least one programming language: Java, Python, JavaScript, or C#
- Solid understanding of QA methodologies, the SDLC, and the STLC
- Experience with one or more test automation tools/frameworks. Web: Selenium WebDriver, Cypress. Mobile: Appium. API: Postman, REST Assured
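A minimal Selenium WebDriver sketch in Python illustrates the kind of script this role produces. The URL and element locators are made up; Selenium 4 with its bundled driver manager is assumed.

```python
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

# Target URL and locators are hypothetical, for illustration only.
driver = webdriver.Chrome()
try:
    driver.get("https://example.com/login")
    driver.find_element(By.ID, "username").send_keys("qa_user")
    driver.find_element(By.ID, "password").send_keys("not-a-real-password")
    driver.find_element(By.CSS_SELECTOR, "button[type=submit]").click()

    # Explicit wait: the dashboard header must appear within 10 seconds.
    header = WebDriverWait(driver, 10).until(
        EC.visibility_of_element_located((By.TAG_NAME, "h1"))
    )
    assert "Dashboard" in header.text
finally:
    driver.quit()
```

Wrapped in pytest or TestNG-style fixtures and run from Jenkins, a script like this becomes one case in the CI-integrated suites the responsibilities describe.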
Posted 3 days ago
10.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Through bold discovery and cutting-edge innovation, we lead an industry that is vital for the future of our planet: lighting. Through our leadership in connected lighting and the Internet of Things, we're breaking new ground in data analytics, AI, and smart solutions for homes, offices, cities, and beyond. At Signify, you can shape tomorrow by building on our incredible 125+ year legacy while working toward even bolder sustainability goals. Our culture of continuous learning, creativity, and commitment to diversity and inclusion empowers you to grow your skills and career. Join us, and together, we'll transform our industry, making a lasting difference for brighter lives and a better world. You light the way.

More About The Role

What you'll do:
- Develop cloud-native microservices for connected lighting solutions.
- Implement modular code to bring software designs to life.
- Work with an Agile team to realise product features.
- Interact with product management to understand requirements and translate them into implementations.

Your Qualifications:
- 10 years building data-intensive applications and pipelines involving concepts like ETL.
- Proficient with core Java. Good to have: practical experience with frameworks like Spring Boot.
- Demonstrates capabilities in API design.
- Has demonstrated ownership of features and modules in their projects, including feature and module design and practices such as code reviews, unit tests, code coverage, and build sanity.
- Has integrated with databases such as Postgres/MySQL. Good to have: practical experience with NoSQL databases.
- Demonstrates capabilities in building cloud-native solutions on cloud platforms such as AWS/GCP/Azure.
- Has a practical understanding of application security.

Everything we'll do for you:
You can grow a lasting career here. We'll encourage you, support you, and challenge you. We'll help you learn and progress in a way that's right for you, with coaching and mentoring along the way. We'll listen to you too, because we see and value every one of our 30,000+ people. We believe that a diverse and inclusive workplace fosters creativity, innovation, and a full spectrum of bright ideas. With a global workforce representing 99 nationalities, we are dedicated to creating an inclusive environment where every voice is heard and valued, helping us all achieve more together.
Posted 3 days ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About Client: Our client is a multinational IT services and consulting company headquartered in the USA, with revenues of 19.7 billion USD, a global workforce of 3,50,000, and a NASDAQ listing. It is one of the leading IT services firms globally, known for its work in digital transformation, technology consulting, and business process outsourcing. Its business focuses on digital engineering, cloud services, AI and data analytics, enterprise applications (SAP, Oracle, Salesforce), IT infrastructure, and business process outsourcing. It has major delivery centers in India, including cities like Chennai, Pune, Hyderabad, and Bengaluru, and offices in over 35 countries. India is a major operational hub alongside its U.S. headquarters.

Job Title: Python with ETL Testing
Key Skills: Python, ETL Testing, MySQL, SQL, Oracle
Job Locations: Hyderabad, Pune, Bangalore, Chennai
Experience: 5+ years
Education Qualification: Any Graduation
Work Mode: Hybrid
Employment Type: Contract
Notice Period: Immediate

Job Description:
- 5 to 10 years of experience in relevant areas
- At least 3+ years of working knowledge in Python
- At least 2+ years of hands-on experience in data testing / ETL testing
- At least 2+ years of hands-on experience with databases such as MySQL, SQL Server, and Oracle
- Must be well versed with Agile methodology
- Hands-on framework creation and improvement of test frameworks for automation
- Experience in creating self-serve tools
- Good experience working with Robot Framework
- Should be able to work with Git
- Demonstrated knowledge of CI/CD tooling (GitLab, Jenkins, etc.)
- Demonstrated knowledge of RDBMSs and SQL queries
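A common ETL-testing pattern behind this job description is source-to-target reconciliation. Here is a minimal pytest sketch; the connection details, database names, and tables are placeholders, and the pymysql client is an assumption (the posting also lists SQL Server and Oracle).

```python
import pymysql  # assumed MySQL client, per the posting's database list
import pytest

# Host, credentials, and schema/table names below are hypothetical.
def fetch_count(database, table):
    conn = pymysql.connect(host="db.example.com", user="qa",
                           password="***", database=database)
    try:
        with conn.cursor() as cur:
            cur.execute(f"SELECT COUNT(*) FROM {table}")
            return cur.fetchone()[0]
    finally:
        conn.close()

def test_row_counts_match_after_load():
    """Classic ETL reconciliation: target row count equals source row count."""
    source = fetch_count("sales_src", "orders")
    target = fetch_count("sales_dw", "fact_orders")
    assert source == target, f"Row count mismatch: {source} vs {target}"
```

Checks like this (row counts, checksums, null-rate comparisons) are the building blocks that Robot Framework suites typically orchestrate in this kind of role.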
Posted 3 days ago
14.0 years
0 Lacs
Greater Hyderabad Area
On-site
Area(s) of responsibility

About Birlasoft
Birlasoft is a powerhouse where domain expertise, enterprise solutions, and digital technologies converge to redefine business processes. We take pride in our consultative and design-thinking approach, driving societal progress by enabling our customers to run businesses with unmatched efficiency and innovation. As part of the CKA Birla Group, a multibillion-dollar enterprise, we boast a 12,500+ professional team committed to upholding the Group's 162-year legacy. Our core values prioritize Diversity, Equity, and Inclusion (DEI) initiatives, along with Corporate Sustainable Responsibility (CSR) activities, demonstrating our dedication to building inclusive and sustainable communities. Join us in shaping a future where technology seamlessly aligns with purpose.

About the Job
The Azure Data Architect is responsible for designing, implementing, and managing scalable and secure data solutions on the Microsoft Azure cloud platform. This role requires a deep understanding of data transformation, data cleansing, data profiling, data architecture principles, cloud technologies, and data engineering practices, helping build a data ecosystem optimized for performance, scalability, and cost efficiency.

Job Title: Azure Data Factory Architect
Location: Noida/Pune
Educational Background: Bachelor's degree in Computer Science, Information Technology, or a related field
Mode of Work: Hybrid
Experience Required: 14+ years

Key Responsibilities:
- Solution Design: Design and implement robust, scalable, and secure data architectures on Azure. Define end-to-end data solutions, including data ingestion, storage, transformation, processing, and analytics. Understand business requirements and translate them into technical solutions.
- Azure Platform Expertise: Leverage Azure services like Azure Data Factory, Azure Synapse Analytics, Azure Data Lake, Azure Databricks, Azure Cosmos DB, and Azure SQL Database. Knowledge of optimization and cost management of Azure data solutions.
- Data Integration and ETL/ELT Pipelines: Design and implement data pipelines for real-time and batch processing. SQL skills to write complex queries. Must have knowledge of establishing one-way or two-way communication channels when integrating various systems.
- Data Governance and Security: Implement data security in line with organizational and regulatory requirements. Implement data quality assurance. Should have knowledge of the different authentication methods used in cloud solutions.
- Performance Optimization: Monitor, troubleshoot, and improve data solution performance. Implement best practices for data solutions.
- Collaboration and Leadership: Provide technical leadership and mentorship to team members.

Mandatory Skills Required:
- Hands-on experience with Azure services like Azure Data Factory, Azure Synapse Analytics, Azure Data Lake Storage Gen2, Azure Key Vault, Azure SQL Database, and Azure Databricks
- Hands-on experience in data migration/data transformation, data cleansing, and data profiling
- Experience in Logic Apps

Soft Skills:
- Communicates effectively
- Problem-solving and analytical skills
- Adapts to evolving technologies
Posted 3 days ago
4.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: SAS Monitoring Specialist
Experience Level: 4 Years
Location: Hyderabad
Job Type: Full-time

Job Summary:
We are seeking a skilled and detail-oriented SAS Monitoring Specialist with 4 years of hands-on experience in SAS environments. The ideal candidate will be responsible for monitoring, maintaining, and optimizing SAS platforms to ensure continuous performance, availability, and data integrity. You will work closely with IT, data engineering, and analytics teams to ensure smooth operation of all SAS systems and processes.

Key Responsibilities:
- Monitor SAS servers and environments (SAS 9.4, SAS Grid, Viya) for performance, stability, and capacity.
- Analyze logs and system alerts to proactively identify potential issues and resolve them promptly.
- Manage and troubleshoot scheduled SAS jobs and batch processes.
- Support daily health checks, user access issues, and performance tuning.
- Collaborate with SAS admins and infrastructure teams to manage upgrades, patches, and migrations.
- Automate monitoring tasks using scripts (shell, Python, or SAS-based).
- Create dashboards and reports to track system performance and job success/failure rates.
- Document system procedures, incidents, and resolution steps.
- Maintain compliance with internal policies and external regulations regarding data usage and security.

Required Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 4+ years of experience in SAS monitoring or administration.
- Strong knowledge of SAS tools (SAS 9.4, Viya, SAS Management Console, Enterprise Guide).
- Experience with SAS job scheduling tools like LSF, Control-M, or similar.
- Familiarity with operating systems (Linux/UNIX/Windows) and system-level monitoring.
- Proficiency in scripting languages for automation (shell, Python, PowerShell, or SAS macros).
- Solid understanding of performance tuning and root cause analysis.
- Excellent problem-solving and communication skills.

Preferred Skills:
- Experience with cloud-based SAS platforms (AWS, Azure).
- Understanding of data integration and ETL processes in SAS.
- Knowledge of monitoring tools like Splunk, Nagios, or Prometheus.
- ITIL certification or knowledge of ITSM tools (ServiceNow, BMC Remedy).
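The "automate monitoring tasks using scripts" responsibility often starts with log scanning. Below is a minimal Python sketch; the log directory and the ERROR/WARNING prefixes are illustrative assumptions, since real SAS log locations and message conventions vary by site configuration.

```python
import re
from pathlib import Path

# Hypothetical log location; real SAS deployments configure their own paths.
LOG_DIR = Path("/opt/sas/logs")
ERROR_PATTERN = re.compile(r"^(ERROR|WARNING):", re.MULTILINE)

def scan_job_logs():
    """Count ERROR/WARNING lines per job log file."""
    failures = {}
    for log_file in LOG_DIR.glob("*.log"):
        hits = ERROR_PATTERN.findall(log_file.read_text(errors="ignore"))
        if hits:
            failures[log_file.name] = len(hits)
    return failures

if __name__ == "__main__":
    for name, count in sorted(scan_job_logs().items()):
        # A production monitor would raise an alert (email, ServiceNow,
        # Splunk event) rather than print to stdout.
        print(f"{name}: {count} error/warning line(s)")
```

Scheduled under cron, LSF, or Control-M, a scan like this feeds the health-check dashboards and failure-rate reports the posting asks for.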
Posted 3 days ago
5.0 years
0 Lacs
India
On-site
Orion Innovation is a premier, award-winning, global business and technology services firm. Orion delivers game-changing business transformation and product development rooted in digital strategy, experience design, and engineering, with a unique combination of agility, scale, and maturity. We work with a wide range of clients across many industries, including financial services, professional services, telecommunications and media, consumer products, automotive, industrial automation, professional sports and entertainment, life sciences, ecommerce, and education.

Responsibilities:
- Build and maintain the infrastructure for data generation, collection, storage, and processing.
- Design, build, and maintain scalable data pipelines to support data flows from various sources to data warehouses and analytics platforms.
- Develop and manage ETL (Extract, Transform, Load) processes to ensure data is accurately transformed and loaded into target systems.
- Design and optimize databases, ensuring the performance, security, and scalability of data storage solutions.
- Integrate data from various internal and external sources into unified data systems for analysis.
- Work with big data technologies (e.g., Hadoop, Spark) to process and manage large volumes of structured and unstructured data.
- Implement and manage cloud-based data solutions using the Azure and Fabric platforms.
- Ensure data quality by developing validation processes and monitoring for anomalies and inconsistencies.
- Work closely with data scientists, analysts, and other stakeholders to meet their data needs and ensure smooth data operations.
- Automate repetitive data processes and workflows to improve efficiency and reduce manual effort.
- Implement and enforce data security protocols, ensuring compliance with industry standards and regulations.
- Optimize data queries and system performance to handle large data sets efficiently.
- Create and maintain clear documentation of data pipelines, infrastructure, and processes for transparency and training.
- Set up monitoring tools to ensure data systems are functioning smoothly, and troubleshoot any issues that arise.
- Stay updated with emerging trends and tools in data engineering and continuously improve the data infrastructure.

Qualifications:
- Azure Solution Architect certification preferred
- Microsoft Fabric Analytics Engineer Associate certification preferred
- 5+ years of architecture experience in technology operations/development using Azure technologies
- Strong experience in Python and PySpark required
- Strong understanding of and experience in building lakehouses, data lakes, and data warehouses
- Strong experience with Microsoft Fabric technologies
- Good understanding of the Scrum Agile methodology
- Strong experience with Azure cloud technologies
- Solid knowledge of SQL and non-relational (NoSQL) databases
- Solid knowledge of networking, firewalls, load balancers, etc.
- Exceptional communication skills and the ability to communicate appropriately with technical teams
- Familiarity with at least one of the following build/deploy tools: Azure DevOps or GitHub Actions

Orion is an equal opportunity employer, and all qualified applicants will receive consideration for employment without regard to race, color, creed, religion, sex, sexual orientation, gender identity or expression, pregnancy, age, national origin, citizenship status, disability status, genetic information, protected veteran status, or any other characteristic protected by law.
Posted 3 days ago
0 years
0 Lacs
Indore, Madhya Pradesh, India
On-site
Company Description
Logikview Technologies Pvt. Ltd. is a forward-thinking data analytics services firm. As a strategic partner, we provide a comprehensive range of analytics services to our clients' business units and analytics teams. From setting up big data and analytics infrastructure to performing data transformations and building advanced predictive analytical engines, Logikview supports clients throughout their analytics journey. We offer ready-to-deploy productized analytics solutions in domains such as retail, telecom, education, and healthcare.

Role Description
We are seeking a full-time Technical Lead for an on-site role in Indore. As a Technical Lead, you will oversee a team of engineers, manage project timelines, and ensure the successful delivery of analytics solutions. Day-to-day tasks include designing and implementing data models, developing and optimizing data pipelines, and collaborating with cross-functional teams to address technical challenges. You will also be responsible for code reviews, mentoring team members, and staying updated with the latest technological advancements.

Qualifications
- Proficiency in data modeling, data warehousing, and ETL processes
- Experience with big data technologies such as Hadoop, Spark, and Kafka
- Knowledge of programming languages like Python, Java, and SQL
- Strong understanding of machine learning algorithms and predictive analytics
- Excellent problem-solving skills and the ability to troubleshoot technical issues
- Proven experience in team leadership and project management
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field
- Relevant certifications in data analytics or big data technologies are a plus
Posted 3 days ago
5.0 years
0 Lacs
Kerala, India
On-site
Hiring: Senior Data Analyst (5+ Years) – Kochi/Trivandrum/Bangalore/Chennai
📍 Location: Kochi / Trivandrum
💰 Budget: Up to 19 LPA
📆 Immediate Joiners Preferred

🚀 About the Role:
We're looking for a Senior Data Analyst to join our Data & Analytics team! You'll transform complex data into actionable insights, drive strategic decisions, and empower stakeholders with intuitive dashboards and reports. If you love digging into data, solving business problems, and communicating insights effectively, this role is for you!

🔧 Mandatory Key Skills (5+ years of experience required):
✔ SQL (advanced)
✔ Power BI (dashboarding and visualization)
✔ Python (data analysis)
✔ Amazon Athena (or similar cloud data tools)
✔ 5+ years in data analysis/business intelligence

Job Description / Duties & Responsibilities:
- Collaborate with business stakeholders to understand data needs and translate them into analytical requirements.
- Analyze large datasets to uncover trends, patterns, and actionable insights.
- Design and build dashboards and reports using Power BI.
- Perform ad-hoc analysis and develop data-driven narratives to support decision-making.
- Ensure data accuracy, consistency, and integrity through data validation and quality checks.
- Build and maintain SQL queries, views, and data models for reporting purposes.
- Communicate findings clearly through presentations, visualizations, and written summaries.
- Partner with data engineers and architects to improve data pipelines and architecture.
- Contribute to the definition of KPIs, metrics, and data governance standards.

Job Specification / Skills and Competencies:
- Bachelor's or Master's degree in Statistics, Mathematics, Computer Science, Economics, or a related field.
- 5+ years of experience in a data analyst or business intelligence role.
- Advanced proficiency in SQL and experience working with relational databases (e.g., SQL Server, Redshift, Snowflake).
- Hands-on experience with Power BI.
- Proficiency in Python, Excel, and data storytelling.
- Understanding of data modelling, ETL concepts, and basic data architecture.
- Strong analytical thinking and problem-solving skills.
- Excellent communication and stakeholder management skills.
- Adherence to Information Security Management policies and procedures.

Soft Skills Required:
- Must be a good team player with good communication and presentation skills
- Must be a proactive problem solver and a self-driven leader
- Able to manage and nurture a team of data engineers
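To show how the SQL, Python, and Athena pieces of this profile typically fit together, here is a minimal sketch using the PyAthena client and pandas. The staging bucket, region, and table names are invented placeholders.

```python
import pandas as pd
from pyathena import connect  # PyAthena: DB-API client for Amazon Athena

# The results bucket, region, and schema below are hypothetical.
conn = connect(
    s3_staging_dir="s3://example-athena-results/",
    region_name="ap-south-1",
)

sql = """
SELECT order_date,
       COUNT(*)    AS orders,
       SUM(amount) AS revenue
FROM   analytics.orders
WHERE  order_date >= date_add('day', -30, current_date)
GROUP  BY order_date
ORDER  BY order_date
"""

# Pull the last 30 days into a DataFrame, e.g., as a Power BI data prep step.
df = pd.read_sql(sql, conn)
print(df.describe())
```

The same aggregate could be published as an Athena view and connected to Power BI directly; pulling it through pandas first is handy for the validation and ad-hoc analysis duties listed above.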
Posted 3 days ago
10.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you'll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express.

How will you make an impact in this role?
American Express' Internal Audit Group (IAG) has reinvented our audit process and is leading the financial services industry with our data-driven continuous auditing methodology, embedding intelligence through the audit lifecycle. IAG's strategic initiatives, combined with our greatest asset – our people – enable IAG to utilize advanced data analysis capabilities, provide greater and continuous assurance and forward-looking risk insights, and help ensure quality products and services are provided to American Express customers. The IAG Analytics & Insights team is looking for those who share our mission and aspirations and are passionate about the use of data and technology in a collaborative, people- and risk-focused environment. We are looking for a dynamic leader to drive our Data Management and Business Intelligence (BI) agenda. This role will combine strategic vision with hands-on execution to build and optimize data pipelines, BI solutions, and analytic systems that empower decision-making for the department and the enterprise at large.

Key Responsibilities:

Leadership and Strategy
· Lead and mentor a cross-functional team of BI developers, engineers, and project managers.
· Define and execute the data and BI strategy, aligning with business priorities.
· Partner with business stakeholders to prioritize and deliver impactful analytics solutions.

Project Management
· Manage the full lifecycle of BI and analytics projects, including scoping, planning, resource allocation, and timeline management.
· Ensure projects are delivered on time, within scope and budget, with clear reporting to leadership.

Solution Development
· Guide the development and scaling of data pipelines, reporting systems, and BI tools.
· Ensure solutions are high-performing, user-friendly, and adherent to data governance standards.
· Support cloud migrations, including integration of BI and machine learning tools for analytic development and production solutions.
· Provide leadership and oversight for the development and deployment of analytic solutions (including advanced analytics) across audit portfolios.

Enablement and Adoption
· Serve as a bridge between business users and technical teams.
· Promote adoption of BI solutions through training, support, and change management.
· Drive process improvement and automation within BI workflows.

Governance and Compliance
· Implement and enforce data governance and data quality standards to ensure data integrity and security.
· Oversee the development of, and adherence to, best practices for data access, reporting, and compliance with industry regulations.

Qualifications
· Bachelor's degree in Computer Science, Information Technology, Business Administration, or a related field. MBA or advanced degrees preferred.
· 10+ years of experience in data and business intelligence, with at least 5 years in a leadership or managerial role.
· Experience with cloud data platforms (AWS, Azure, Google Cloud).
· Strong expertise in BI tools (e.g., Power BI, Tableau, Qlik), automation solutions, and data modeling techniques.
· Experience with data integration, ETL processes, and data warehousing concepts.
· Proven ability to design and implement end-to-end BI solutions and data architectures.
· Experience managing cross-functional teams and driving organizational change.
· Expertise in data governance, security, and compliance best practices.
· Excellent communication and interpersonal skills, with the ability to engage with both technical teams and business stakeholders.
· Project management experience and familiarity with Agile methodologies.
· Strong problem-solving and analytical skills, with a focus on delivering actionable insights from complex data.

We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally:
· Competitive base salaries
· Bonus incentives
· Support for financial well-being and retirement
· Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
· Flexible working model with hybrid, onsite, or virtual arrangements depending on role and business need
· Generous paid parental leave policies (depending on your location)
· Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
· Free and confidential counseling support through our Healthy Minds program
· Career development and training opportunities

American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
Posted 3 days ago
5.0 years
0 Lacs
Coimbatore, Tamil Nadu, India
On-site
Experience: 5+ years
Notice Period: Immediate to 15 days
Rounds: 3 (virtual)
Mandatory Skills: Apache Spark, Hive, Hadoop, Scala, Databricks

Job Description
The Role:
- Designing and building optimized data pipelines using cutting-edge technologies in a cloud environment to drive analytical insights.
- Constructing infrastructure for efficient ETL processes from various sources and storage systems.
- Leading the implementation of algorithms and prototypes to transform raw data into useful information.
- Architecting, designing, and maintaining database pipeline architectures, ensuring readiness for AI/ML transformations.
- Creating innovative data validation methods and data analysis tools.
- Ensuring compliance with data governance and security policies.
- Interpreting data trends and patterns to establish operational alerts.
- Developing analytical tools, programs, and reporting mechanisms.
- Conducting complex data analysis and presenting results effectively.
- Preparing data for prescriptive and predictive modeling.
- Continuously exploring opportunities to enhance data quality and reliability.
- Applying strong programming and problem-solving skills to develop scalable solutions.

Requirements:
- Experience with big data technologies (Hadoop, Spark, NiFi, Impala).
- 5+ years of hands-on experience designing, building, deploying, testing, maintaining, monitoring, and owning scalable, resilient, distributed data pipelines.
- High proficiency in Scala/Java and Spark for applied large-scale data processing.
- Expertise with big data technologies, including Spark, Data Lake, and Hive.
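The role calls for Scala/Java Spark; the sketch below uses the equivalent DataFrame API in Python for brevity, with made-up paths, column names, and table names. It shows the shape of a typical batch aggregation landing in a Hive table.

```python
from pyspark.sql import SparkSession, functions as F

spark = (SparkSession.builder
         .appName("daily-events-pipeline")  # hypothetical job name
         .enableHiveSupport()
         .getOrCreate())

# Ingest raw events; the lake path and schema are illustrative.
events = spark.read.parquet("s3a://example-lake/raw/events/")

daily = (events
         .withColumn("event_date", F.to_date("event_ts"))
         .groupBy("event_date", "event_type")
         .agg(F.count("*").alias("events"),
              F.countDistinct("user_id").alias("users")))

# Persist to a Hive-managed table, partitioned by date for pruning.
(daily.write
      .mode("overwrite")
      .partitionBy("event_date")
      .saveAsTable("analytics.daily_event_summary"))
```

The same chain translates almost line for line into Scala, which is the idiom the posting's Scala/Spark requirement points at.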
Posted 3 days ago
4.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Requirements:
- 4+ years of experience as a Data Engineer.
- Strong proficiency in SQL.
- Hands-on experience with modern cloud data warehousing solutions (Snowflake, BigQuery, Redshift).
- Expertise in ETL/ELT processes and batch and streaming data processing.
- Proven ability to troubleshoot data issues and propose effective solutions.
- Knowledge of AWS services (S3, DMS, Glue, Athena).
- Familiarity with DBT for data transformation and modeling.
- Must be fluent in English communication.

Desired Experience:
- Experience with additional AWS services (EC2, ECS, EKS, VPC, IAM).
- Knowledge of Infrastructure as Code (IaC) tools like Terraform and Terragrunt.
- Proficiency in Python for data engineering tasks.
- Experience with orchestration tools like Dagster, Airflow, or AWS Step Functions.
- Familiarity with pub/sub, queuing, and streaming frameworks (AWS Kinesis, Kafka, SQS, SNS).
- Experience with CI/CD pipelines and automation for data processes.
Posted 3 days ago
4.0 years
0 Lacs
Gandhinagar, Gujarat, India
On-site
Requirements:
- 4+ years of experience as a Data Engineer.
- Strong proficiency in SQL.
- Hands-on experience with modern cloud data warehousing solutions (Snowflake, BigQuery, Redshift).
- Expertise in ETL/ELT processes and batch and streaming data processing.
- Proven ability to troubleshoot data issues and propose effective solutions.
- Knowledge of AWS services (S3, DMS, Glue, Athena).
- Familiarity with DBT for data transformation and modeling.
- Must be fluent in English communication.

Desired Experience:
- Experience with additional AWS services (EC2, ECS, EKS, VPC, IAM).
- Knowledge of Infrastructure as Code (IaC) tools like Terraform and Terragrunt.
- Proficiency in Python for data engineering tasks.
- Experience with orchestration tools like Dagster, Airflow, or AWS Step Functions.
- Familiarity with pub/sub, queuing, and streaming frameworks (AWS Kinesis, Kafka, SQS, SNS).
- Experience with CI/CD pipelines and automation for data processes.
Posted 3 days ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Global Technology Solutions (GTS) at ResMed is a division dedicated to creating innovative, scalable, and secure platforms and services for patients, providers, and people across ResMed. The primary goal of GTS is to accelerate well-being and growth by transforming the core, enabling patient, people, and partner outcomes, and building future-ready operations. The GTS strategy focuses on aligning goals and promoting collaboration across all organizational areas. This includes fostering shared ownership, developing flexible platforms that can easily scale to meet global demands, and implementing global standards for key processes to ensure efficiency and consistency.

Role Overview
As a Data Engineering Lead, you will be responsible for overseeing and guiding the data engineering team in developing, optimizing, and maintaining our data infrastructure. You will play a critical role in ensuring the seamless integration and flow of data across the organization, enabling data-driven decision-making and analytics.

Key Responsibilities:
- Data Integration: Coordinate with various teams to ensure seamless data integration across the organization's systems.
- ETL Processes: Develop and implement efficient data transformation and ETL (Extract, Transform, Load) processes.
- Performance Optimization: Optimize data flow and system performance for enhanced functionality and efficiency.
- Data Security: Ensure adherence to data security protocols and compliance standards to protect sensitive information.
- Infrastructure Management: Oversee the development and maintenance of the data infrastructure, ensuring scalability and reliability.
- Collaboration: Work closely with data scientists, analysts, and other stakeholders to support data-driven initiatives.
- Innovation: Stay updated with the latest trends and technologies in data engineering and implement best practices.

Qualifications:
- Experience: Proven experience in data engineering, with a strong background in leading and managing teams.
- Technical Skills: Proficiency in programming languages such as Python, Java, and SQL, along with experience in big data technologies like Hadoop, Spark, and Kafka.
- Data Management: In-depth understanding of data warehousing, data modeling, and database management systems.
- Analytical Skills: Strong analytical and problem-solving skills, with the ability to handle complex data challenges.
- Communication: Excellent communication and interpersonal skills, capable of working effectively with cross-functional teams.
- Education: Bachelor's or Master's degree in Computer Science, Engineering, or a related field.

Why Join Us?
- Work on cutting-edge data projects and contribute to the organization's data strategy.
- A collaborative and innovative work environment that values creativity and continuous learning.

If you are a strategic thinker with a passion for data engineering and leadership, we would love to hear from you. Apply now to join our team and make a significant impact on our data-driven journey. Joining us is more than saying "yes" to making the world a healthier place. It's discovering a career that's challenging, supportive, and inspiring. Where a culture driven by excellence helps you not only meet your goals, but also create new ones. We focus on creating a diverse and inclusive culture, encouraging individual expression in the workplace, and thrive on the innovative ideas this generates. If this sounds like the workplace for you, apply now! We commit to respond to every applicant.
Posted 3 days ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About Company
Our client is a trusted global innovator of IT and business services. We help clients transform through consulting, industry solutions, business process services, digital and IT modernization, and managed services. Our client enables them, as well as society, to move confidently into the digital future. We are committed to our clients' long-term success and combine global reach with local client attention to serve them in over 50 countries around the globe.

Job Title: Jr SAP BODS Developer
Location: Hyderabad
Experience: 5+ years
Job Type: Contract to hire
Notice Period: Immediate joiner

Position Overview:
- Understand and execute data migration blueprints (migration concepts, transformation rules, mappings, selection criteria).
- Understand and contribute to the documentation of data mapping specifications, conversion rules, and technical design specifications as required.
- Build the conversion processes and associated programs that will migrate the data per the design and conversion rules that have been signed off by the client.
- Execute all data migration technical steps (extract, transform, and load) as well as defect management and issue resolution.
- Perform data load activities for each mock load, cutover simulation, and production deployment identified in the L1 plan, into the identified environments.
- Provide technical support, defect management, and issue resolution during all testing cycles, including mock data load cycles.
- Complete all data migration documentation necessary to support system validation/compliance requirements.
- Support the development of unit and end-to-end data migration test plans and test scripts (including testing for data extraction, transformation, data loading, and data validation).

Job Requirements:
- 2-4 years of overall technical experience in SAP BODS across all application modules (Extract, Transform, Load)
- 1-2 years of data migration experience with S/4 HANA/ECC implementations
- Experience with BODS Designer components: Projects, Jobs, Workflows, Data Flows, Scripts, Data Stores, and Formats
- Experience in BODS performance tuning techniques using parallel processing (Degree of Parallelism), multithreading, partitioning, and database throughputs to improve job performance
- Experience in ETL using SAP BODS and SAP IS with respect to SAP master/transaction data objects in SAP FICO, SAP SD, SAP MM/WM, and SAP Plant Maintenance

Qualifications: Bachelor's degree in Computer Science or a related field
Posted 3 days ago
9.0 - 14.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Greetings from TCS!!!

TCS is hiring for Big Data Architect
Location: PAN India
Years of Experience: 9-14 years

Job Description:
- Experience with Python, Spark, and Hive data pipelines using ETL processes
- Apache Hadoop development and implementation
- Experience with streaming frameworks such as Kafka
- Hands-on experience with Azure/AWS/Google data services
- Work with big data technologies (Spark, Hadoop, BigQuery, Databricks) for data preprocessing and feature engineering
Posted 3 days ago
7.0 - 20.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Greetings from TCS!!

TCS is hiring for Data Solution Architect
Interview Mode: Virtual
Required Experience: 7-20 years
Work Location: PAN India

Must have:
- Design and manage ETL pipelines on cloud platforms (preference for AWS)
- Utilize tools like Airflow to orchestrate tasks and CloudWatch to manage notifications
- Collaborate with cross-functional teams to enhance data-driven decision-making, ensuring alignment of machine learning projects with our strategic business goals
- Develop containerized applications to improve data accuracy, accessibility, and reliability through customized Python solutions
- Contribute to the data governance workflow, improving data quality, standardizing definitions, and training and onboarding data stewards to own their teams' KPIs
- Pilot next-generation technologies, using generative AI or other machine learning techniques to solve problems traditional programming struggles with

Key Skills/Knowledge:
- Excellent hands-on knowledge of Python
- Proven experience in data engineering, with a strong focus on machine learning operations
- Proficiency in developing ETL pipelines and architecting big data solutions
- Expertise in one cloud platform and a proven record of working on at least one project end to end
- Strong collaboration skills, with the ability to work effectively across diverse teams
- A passion for innovation, driven by a desire to push the boundaries of what's possible with technology in an ambiguous environment

If interested, kindly send your updated CV and the details below via DM/e-mail to srishti.g2@tcs.com:
Name:
E-mail ID:
Contact Number:
Highest qualification (full-time):
Preferred Location:
Highest qualification university:
Current organization:
Total years of experience:
Relevant years of experience:
Any gap (mention number of months/years, career/education):
If any, reason for gap:
Is it a re-begin:
Previous organization name:
Current CTC:
Expected CTC:
Notice Period:
Have you worked with TCS before (Permanent/Contract):
Posted 3 days ago
The ETL (Extract, Transform, Load) job market in India is thriving with numerous opportunities for job seekers. ETL professionals play a crucial role in managing and analyzing data effectively for organizations across various industries. If you are considering a career in ETL, this article will provide you with valuable insights into the job market in India.
Major hubs such as Bengaluru, Hyderabad, Pune, Chennai, and Mumbai (the cities where most of the listings above are based) are known for their thriving tech industries and often have a high demand for ETL professionals.
The average salary range for ETL professionals in India varies based on experience levels. Entry-level positions typically start at around ₹3-5 lakhs per annum, while experienced professionals can earn upwards of ₹10-15 lakhs per annum.
In the ETL field, a typical career path may include roles such as:
- Junior ETL Developer
- ETL Developer
- Senior ETL Developer
- ETL Tech Lead
- ETL Architect
As you gain experience and expertise, you can progress to higher-level roles within the ETL domain.
Alongside ETL, professionals in this field are often expected to have skills in:
- SQL
- Data Warehousing
- Data Modeling
- ETL Tools (e.g., Informatica, Talend)
- Database Management Systems (e.g., Oracle, SQL Server)
Having a strong foundation in these related skills can enhance your capabilities as an ETL professional.
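To make the SQL and data-warehousing items concrete, here is a small, self-contained sketch of the kind of query ETL work revolves around, run against an in-memory SQLite database. The star-schema table names and sample rows are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Tiny star-schema fixture: one dimension and one fact table.
cur.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fact_sales  (sale_id INTEGER, product_id INTEGER, amount REAL);
INSERT INTO dim_product VALUES (1, 'books'), (2, 'toys');
INSERT INTO fact_sales  VALUES (10, 1, 250.0), (11, 1, 100.0), (12, 2, 75.5);
""")

# A typical transform-style aggregation: revenue by category.
cur.execute("""
SELECT d.category, SUM(f.amount) AS revenue
FROM fact_sales f
JOIN dim_product d ON d.product_id = f.product_id
GROUP BY d.category
ORDER BY revenue DESC
""")
print(cur.fetchall())  # [('books', 350.0), ('toys', 75.5)]
```

Being able to reason through joins and aggregations like this, and explain how the fact/dimension split supports them, is exactly what ETL interviews tend to probe.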
Here are 25 interview questions that you may encounter in ETL job interviews:
As you explore ETL jobs in India, remember to showcase your skills and expertise confidently during interviews. With the right preparation and a solid understanding of ETL concepts, you can embark on a rewarding career in this dynamic field. Good luck with your job search!