7.0 - 12.0 years
0 Lacs
hyderabad, telangana
On-site
As a Lead Data Engineer with 7-12 years of experience, you will be an integral part of our team, contributing significantly to the design, development, and maintenance of our data infrastructure. Your primary responsibilities will revolve around creating and managing robust data architectures, ETL processes, and data warehouses, and using big data and cloud technologies to support our business intelligence and analytics needs.

You will lead the design and implementation of data architectures that support data warehousing, integration, and analytics platforms. Developing and optimizing ETL pipelines will be a key aspect of your role: ensuring efficient processing of large datasets and implementing data transformation and cleansing processes to maintain data quality. Your expertise will be crucial in building and maintaining scalable data warehouse solutions using technologies such as Snowflake, Databricks, or Redshift. Additionally, you will leverage AWS Glue and PySpark for large-scale data processing, manage data pipelines with Apache Airflow, and use cloud platforms such as AWS, Azure, and GCP for data storage, processing, and analytics.

Establishing data governance and security best practices, ensuring data integrity, accuracy, and availability, and implementing monitoring and alerting systems are vital components of the role. Collaborating closely with stakeholders, mentoring junior engineers, and leading data-related projects will also be part of your remit.

Your technical skills should include proficiency in ETL tools such as Informatica PowerCenter, along with Python, PySpark, SQL, RDBMS platforms, and data warehousing concepts. Soft skills such as clear communication, leadership, problem-solving, and the ability to manage multiple projects effectively will be essential for success in this role. Preferred qualifications include experience with machine learning workflows, certification in relevant data engineering technologies, and familiarity with Agile methodologies and DevOps practices.

Location: Hyderabad
Employment Type: Full-time
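As a rough illustration of the Airflow pipeline management this role describes, here is a minimal sketch of a DAG that chains extract, transform, and load tasks on a daily schedule. All names are illustrative, and the `schedule` argument assumes Airflow 2.4+ (earlier releases use `schedule_interval`):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**context):
    # Pull the latest batch from the source system (placeholder).
    ...

def transform(**context):
    # Cleanse and reshape the extracted batch (placeholder).
    ...

def load(**context):
    # Write the transformed batch to the warehouse (placeholder).
    ...

with DAG(
    dag_id="daily_sales_etl",          # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load  # linear task dependency chain
```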
Posted 17 hours ago
4.0 - 9.0 years
0 Lacs
kolkata, west bengal
On-site
At PwC, the focus in risk and compliance is on maintaining regulatory compliance and managing risks for clients by providing advice and solutions. The goal is to help organizations navigate complex regulatory landscapes and enhance internal controls to mitigate risks effectively. As part of the enterprise risk management team at PwC, you will be responsible for identifying and mitigating potential risks that could impact an organization's operations and objectives. Your role will involve developing business strategies to effectively manage and navigate risks in a rapidly changing business environment.

Your primary focus will be on building meaningful client connections, learning how to manage and inspire others, and growing your personal brand. You will navigate complex situations, deepen your technical expertise, and become more aware of your strengths. Anticipating the needs of your teams and clients, delivering quality, and embracing ambiguity are key aspects of this role. You should be comfortable when the path forward isn't clear, ask questions, and view such moments as opportunities for growth.

To lead and deliver value effectively at this level, you should possess a range of skills, knowledge, and experiences, including but not limited to:
- Responding effectively to diverse perspectives, needs, and feelings of others.
- Using a broad range of tools, methodologies, and techniques to generate new ideas and solve problems.
- Employing critical thinking to break down complex concepts.
- Understanding the broader objectives of your project or role and how your work aligns with the overall strategy.
- Developing a deeper understanding of the business context and its evolving nature.
- Using reflection to enhance self-awareness, strengths, and development areas.
- Interpreting data to derive insights and recommendations.
- Upholding professional and technical standards, the Firm's code of conduct, and independence requirements.

As a Senior Associate at PwC Acceleration Centers (ACs), you will play a pivotal role in supporting various services, from Advisory to Assurance, Tax, and Business Services. Engaging in challenging projects and providing distinctive services to support client engagements will be part of your responsibilities. You will also participate in dynamic and digitally enabled training to enhance your technical and professional skills.

In the OFRO - QA team, you will be responsible for maintaining the quality and accuracy of dashboards and data workflows through meticulous testing and validation. Leveraging your knowledge in data analysis and automation testing, you will mentor others, navigate complex testing environments, and uphold quality standards throughout the software development lifecycle. This role offers an exciting opportunity to work with advanced BI tools and contribute to continuous improvement initiatives in a dynamic team setting.

Key Responsibilities:

ETL Development & Data Engineering
- Design, build, and maintain scalable ETL pipelines using Azure Data Factory, Databricks, and custom Python scripts.
- Integrate and ingest data from on-prem, cloud, and third-party APIs into modern data platforms.
- Perform data cleansing, validation, and transformation to ensure data quality and consistency.
- Machine learning experience is desirable.

Programming and Scripting
- Write robust and reusable Python scripts for data processing, automation, and orchestration.
- Develop complex SQL queries for data extraction, transformation, and reporting.
- Optimize code for performance, scalability, and maintainability.

Cloud & Platform Integration
- Work within Azure ecosystems, including Blob Storage, SQL Database, ADF, Synapse, and Key Vault.
- Utilize Databricks (PySpark/Delta Lake) for advanced transformations and big data processing.
- Power BI hands-on experience is a plus.

Collaboration and Communication
- Collaborate closely with cross-functional teams to ensure quality throughout the software development lifecycle.
- Provide regular status updates and test results to stakeholders.
- Participate in daily stand-ups, sprint planning, and Agile ceremonies.

Shift time: 2pm to 11pm IST
Total experience required: 4-9 years
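A minimal sketch of the kind of PySpark cleansing and validation work this role describes, as it might run on Databricks; the paths, column names, and rules are all hypothetical:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("qa_cleansing").getOrCreate()

# Hypothetical input path; on Databricks this would typically be a Delta table.
raw = spark.read.json("/mnt/raw/events/")

cleansed = (
    raw.dropDuplicates(["event_id"])                       # remove duplicate events
       .filter(F.col("event_ts").isNotNull())              # drop rows missing a timestamp
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
)

# Simple validation: report how many rows the timestamp rule rejected.
rejected = raw.filter(F.col("event_ts").isNull()).count()
print(f"rejected rows: {rejected}")

cleansed.write.mode("overwrite").format("delta").save("/mnt/curated/events/")
```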
Posted 1 day ago
6.0 - 10.0 years
0 Lacs
karnataka
On-site
As a Senior Database Developer at Kobie's India Tech Hub, you will have the opportunity to be one of the first hires in an exciting new venture. Kobie, a leading player in the loyalty industry, is expanding its global footprint by establishing a Tech Hub in India. This initiative aims to create deep connections with customers through personalized, data-driven loyalty experiences, further enhancing enterprise value through loyalty.

Your role will involve designing scalable, data-driven solutions for high-impact loyalty platforms, leveraging your expertise in PL/pgSQL, efficient SQL queries, performance tuning, and ETL workflows. You will work with Oracle and/or PostgreSQL databases, handling complex data structures to support data integration, transformation, and analytics.

As a key member of the team, you will be responsible for developing and maintaining database solutions that facilitate client onboarding, reward processing, data quality assurance, and operational performance. Collaboration with cross-functional teams such as developers, QA specialists, analysts, and DBAs will be essential to optimize data pipelines and queries, ensuring they meet the evolving needs of clients and marketing platforms.

Your impact will be significant as you contribute to import and extract processes, data migration efforts, troubleshooting data quality and performance issues, tuning queries for optimal performance, supporting data integration from various sources, and providing technical assistance to stakeholders. Your ability to work both independently and collaboratively, manage priorities effectively, and communicate with technical and non-technical team members will be crucial for success.

To excel in this role, you should have 5-7+ years of experience in SQL query design and maintenance, proficiency in Oracle and/or PostgreSQL, expertise in performance tuning, ETL development, data security, and a track record of working in team environments. Bonus skills include experience with data mapping tools, modern cloud platforms like Snowflake, job scheduling automation, version control systems, and supporting Java-based development teams.

At Kobie, known for its award-winning culture and innovative loyalty solutions, you will be part of a collaborative and growth-focused environment. As a trusted partner to global brands, Kobie focuses on building lasting emotional connections with consumers through strategy-led technology solutions. The launch of the India Tech Hub presents an exciting opportunity to be part of a culture that values diversity, equity, inclusion, and giving back to the community.

Joining Kobie means access to competitive benefits, comprehensive health coverage, well-being perks, flexible time off, and opportunities for career growth. The integration of new teammates in India with U.S. teams, exposure to global projects, and the future establishment of a physical office in Bengaluru emphasize Kobie's commitment to collaboration and connection. This is your chance to be part of something significant and shape the future of the Kobie India Tech Hub. Apply now and contribute to delivering innovative customer experiences for renowned brands while working alongside industry leaders in the loyalty space.
Posted 1 day ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
As an Ivalua Consultant with LTIM, a leading company in the industry, you will bring a minimum of 5 years of experience and strong expertise in handling Ivalua as a procurement application, specifically in the source-to-contract module, along with software deployment processes.

Your responsibilities will include configuring and deploying the Ivalua system in alignment with documented business needs, focusing on org structures, profiles, authorizations, alerts, notifications, and workflow configurations. Your skills in designing configuration, including rules, fields, filters, and pages, will play a crucial role in the success of projects.

Your proficiency in SQL queries, ETL development, EAI, and integration will be a valuable asset in this role. Additionally, your experience in configuring reports, spend analysis, and deploying classification rules will be essential, and your functional knowledge of the Sourcing, Supplier, Contract, and Spend modules will be put to good use. Familiarity with ITIL V3 processes and an overview of ITIL V4 processes in AMS is necessary. If you hold an Ivalua S2C certification, it will be considered an advantage; otherwise, you will be guided through the certification process to build your skills to the client's requirements.

Your role will also involve managing client relationships, demonstrating strong leadership, exceptional communication skills, and effective project management. You will act as a mentor to colleagues and project teams, ensuring project milestones are achieved. A Bachelor's degree in Engineering or a BS/BA in Procurement, Computer Science, or a related field is preferred.

If you are passionate about configuring solutions based on complex requirements and coordinating efforts with team members, this role is tailored for you. To join this dynamic team at LTIM, based in Pan-India locations, share your resume with us at ashwini.sakpal@ltimindtree.com and embark on a rewarding career journey as an Ivalua Consultant.
Posted 2 days ago
8.0 - 12.0 years
0 Lacs
indore, madhya pradesh
On-site
You are a highly skilled and experienced ETL Developer with expertise in data ingestion and extraction, sought to join our team. With 8-12 years of experience, you specialize in building and managing scalable ETL pipelines, integrating diverse data sources, and optimizing data workflows specifically for Snowflake. Your role will involve collaborating with cross-functional teams to extract, transform, and load large-scale datasets in a cloud-based data ecosystem, ensuring data quality, consistency, and performance.

Your responsibilities will include designing and implementing processes to extract data from various sources such as on-premise databases, cloud storage (S3, GCS), APIs, and third-party applications. You will ensure seamless data ingestion into Snowflake, utilizing tools like SnowSQL, COPY INTO commands, Snowpipe, and third-party ETL tools (Matillion, Talend, Fivetran). Developing robust solutions for handling data ingestion challenges such as connectivity issues, schema mismatches, and data format inconsistencies will be a key aspect of your role.

Within Snowflake, you will perform complex data transformations using SQL-based ELT methodologies, implement incremental loading strategies, and track data changes using Change Data Capture (CDC) techniques. You will optimize transformation processes for performance and scalability, leveraging Snowflake's native capabilities such as clustering, materialized views, and UDFs.

Designing and maintaining ETL pipelines capable of efficiently processing terabytes of data will be part of your responsibilities. You will optimize ETL jobs for performance, parallelism, and data compression, ensuring error logging, retry mechanisms, and real-time monitoring for robust pipeline operation.

Your role will also involve implementing mechanisms for data validation, integrity checks, duplicate handling, and consistency verification. Collaborating with stakeholders to ensure adherence to data governance standards and compliance requirements will be essential. You will work closely with data engineers, analysts, and business stakeholders to define requirements and deliver high-quality solutions. Documenting data workflows, technical designs, and operational procedures will also be part of your responsibilities.

Your expertise should include 8-12 years of experience in ETL development and data engineering, with significant experience in Snowflake. You should be proficient in tools and technologies such as Snowflake (SnowSQL, COPY INTO, Snowpipe, external tables), ETL tools (Matillion, Talend, Fivetran), cloud storage (S3, GCS, Azure Blob Storage), databases (Oracle, SQL Server, PostgreSQL, MySQL), and APIs (REST, SOAP for data extraction). Strong SQL skills, performance optimization techniques, data transformation expertise, and soft skills like strong analytical thinking, problem-solving abilities, and excellent communication skills are essential for this role.

Location: Bhilai, Indore
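A minimal sketch of the COPY INTO plus incremental-merge pattern this posting describes, using the Snowflake Python connector; the stage, tables, and credentials are placeholders, and a real pipeline would add error logging and retries:

```python
import snowflake.connector

# Connection parameters are placeholders.
conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="...",
    warehouse="ETL_WH", database="ANALYTICS", schema="STAGING",
)
cur = conn.cursor()

# Bulk-load new files from an external stage (e.g., S3) into a staging table.
cur.execute("""
    COPY INTO staging.orders_raw
    FROM @ext_stage/orders/
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    ON_ERROR = 'CONTINUE'
""")

# Incremental upsert into the target: one common CDC-style loading pattern.
cur.execute("""
    MERGE INTO analytics.orders AS t
    USING staging.orders_raw AS s
      ON t.order_id = s.order_id
    WHEN MATCHED THEN UPDATE SET t.status = s.status, t.updated_at = s.updated_at
    WHEN NOT MATCHED THEN INSERT (order_id, status, updated_at)
      VALUES (s.order_id, s.status, s.updated_at)
""")
conn.close()
```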
Posted 2 days ago
5.0 - 9.0 years
0 Lacs
indore, madhya pradesh
On-site
You are an experienced ETL Developer with over 5 years of expertise in data transformation, integration, and visualization. Your role will involve designing, implementing, and maintaining ETL processes using SQL Server Integration Services (SSIS) for efficient data extraction, transformation, and loading. You will be responsible for developing data transformation and integration solutions, ensuring data consistency and accuracy across systems. Performing data cleansing, validation, and enrichment to ensure high-quality data for reporting and analysis will also be part of your responsibilities.

Analyzing large datasets to uncover trends, patterns, and insights, as well as performing data mining to support decision-making processes, will be crucial. You will create and maintain visualizations to present data insights effectively to stakeholders. Developing and maintaining database objects including tables, stored procedures, triggers, and user functions within a SQL Server environment is another key aspect of your role.

Your expertise in implementing end-to-end BI solutions using the MSBI stack, including SSRS (SQL Server Reporting Services) and SSAS (SQL Server Analysis Services), will be utilized. Additionally, you will design and manage data warehouse solutions, optimize T-SQL query performance, and create and manage reusable SSIS components for streamlined ETL processes. Contributing to physical and logical database design, data mapping, and table normalization will also be part of your responsibilities.

Your role will require you to identify dimensions, facts, measures, and hierarchies for data migration to SQL Server databases. Utilizing DMVs, SQL Profiler, and Extended Events for database performance optimization, debugging, and tuning will be essential. Working with Microsoft Azure cloud services including Azure Blob Storage, Azure SQL Server, and Azure Data Factory is also a part of this role. You will use Git (Azure DevOps) for version control and collaboration, and participate in Agile/SCRUM methodologies for project management and development.

Minimum qualifications:
- At least 5 years of experience in ETL development and data warehousing.
- Advanced proficiency in T-SQL, SSIS, and SQL Server database management.
- Experience with the MSBI stack, including SSRS and SSAS.
- Knowledge of data warehousing methodologies and concepts.
- Skill in optimizing T-SQL queries and database performance.
- Experience with Microsoft Azure cloud services.
- Familiarity with Agile/SCRUM project management methodologies.
- Proficiency in Git version control and Azure DevOps.

Preferred qualifications:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- Certifications in Microsoft SQL Server or Azure data technologies.
- Experience with additional BI tools and data visualization platforms.
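SSIS packages themselves are designed visually rather than written as code, but the incremental dimension-load pattern the posting describes often reduces to a T-SQL MERGE like the one an Execute SQL Task would run. Below is a minimal sketch driving such a statement from Python via pyodbc; the connection string and all object names are hypothetical:

```python
import pyodbc

# Hypothetical connection string; an SSIS Execute SQL Task would run the same T-SQL.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=dwhost;DATABASE=SalesDW;Trusted_Connection=yes"
)
cur = conn.cursor()

# Incremental load of a dimension table using T-SQL MERGE, keyed on a row hash.
cur.execute("""
    MERGE dbo.DimCustomer AS t
    USING staging.Customer AS s
      ON t.CustomerKey = s.CustomerKey
    WHEN MATCHED AND t.RowHash <> s.RowHash THEN
        UPDATE SET t.Name = s.Name, t.City = s.City, t.RowHash = s.RowHash
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (CustomerKey, Name, City, RowHash)
        VALUES (s.CustomerKey, s.Name, s.City, s.RowHash);
""")
conn.commit()
```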
Posted 2 days ago
3.0 - 8.0 years
7 - 12 Lacs
Bengaluru
Work from Office
Primary Responsibilities:
- Be a team player in an agile team within a release team / value stream
- Develop and automate business solutions by creating new and modifying existing software applications
- Be technically hands-on and excellent in design, coding, and end-to-end testing, ensuring product quality
- Participate and contribute in Sprint ceremonies
- Promote and develop the culture of collaboration, accountability, and quality
- Provide technical support to the team and help the team in resolving technical issues
- Work closely with the Tech Lead, onshore partners, deployment, and infrastructure teams
- Apply a basic, structured, standard approach to work
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so

Required Qualifications:
- Graduate degree or equivalent experience
- 3+ years of experience working in data warehousing and data mart platforms
- 3+ years of working experience in the warehousing ecosystem: design and development, scheduling jobs using Airflow, and running and monitoring refreshes
- 3+ years of working experience in big data technologies around Spark or PySpark and Databricks
- 3+ years of working experience in an Agile team
- 2+ years of working experience in cloud and DevOps technologies, preferably on Azure: Docker/Kubernetes/Terraform/Chef
- Working experience in CI/CD pipelines (test, build, deployment and monitoring automation)
- Knowledge of software configuration management and packaging
- Demonstrates excellent problem-solving skills

Preferred Qualification:
- 3+ years of working experience in ELT/ETL design and development and solid experience in SQL on Teradata and Snowflake

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.
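A minimal sketch of the kind of PySpark/Databricks refresh this role mentions: merging a fresh slice of a Delta source into a data mart so reruns stay idempotent. Paths, table names, and the partitioning column are assumptions:

```python
from delta.tables import DeltaTable
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("mart_refresh").getOrCreate()

# Read only today's slice of the source Delta table (paths are illustrative).
updates = (
    spark.read.format("delta").load("/mnt/warehouse/orders")
         .filter(F.col("load_date") == F.current_date())
)

target = DeltaTable.forPath(spark, "/mnt/marts/orders_daily")

# Merge the fresh slice into the mart; reruns overwrite rather than duplicate.
(target.alias("t")
       .merge(updates.alias("u"), "t.order_id = u.order_id")
       .whenMatchedUpdateAll()
       .whenNotMatchedInsertAll()
       .execute())
```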
Posted 3 days ago
5.0 - 7.0 years
10 - 15 Lacs
Chennai
Hybrid
Designation: Module Leader
Role: ETL Developer
Location: Chennai
Notice Period: Immediate to 30 days
Experience range: 5-7 years of development experience in the Amazon cloud environment, AWS (S3, AWS Glue, Amazon Redshift, Data Lake)

Requirements:
- Experience in SSRS
- Experience in SSIS
- Experience in ETL
- Experience in Power BI
- Experience in AWS Glue
- Create ETL jobs using Python/PySpark to fulfill the requirements
- Ability to perform data manipulations and to load and extract data from several sources into another schema
- Good experience with project management practices, proficiency with Agile and Waterfall methodologies, working with scrum teams, and timely reporting
- Experience with the software development life cycle and all its phases
- 7+ years of database development experience
- Understanding of core AWS services and basic AWS architecture best practices
- AWS technologies: S3, AWS Glue, RDS, Lambda, CloudWatch, etc.
- Troubleshoot and resolve issues related to data quality, performance, and reliability
- Document ETL processes and workflows for future reference and be able to demo completed work
- Optimize and maintain existing ETL processes to ensure high performance and efficiency
- Strong analytical and collaboration skills; a team player
- Excellent problem-solving and troubleshooting skills
- Self-starter, able to learn and adapt quickly
- Strong verbal and written communication skills with an ability to understand frontend users' requirements

Note: Work timings 1pm - 11pm
Interested candidates can also share their updated resume at megha.chattopadhyay@aspiresys.com
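A minimal sketch of an AWS Glue PySpark job of the sort this posting asks for, using the standard Glue job boilerplate; the catalog database, table, and S3 bucket are placeholders:

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from the Glue Data Catalog (database/table names are placeholders).
src = glue_context.create_dynamic_frame.from_catalog(
    database="sales_db", table_name="raw_orders"
)

# Convert to a Spark DataFrame for SQL-style manipulation, then write to S3.
df = src.toDF().dropDuplicates(["order_id"])
df.write.mode("overwrite").parquet("s3://my-bucket/curated/orders/")

job.commit()
```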
Posted 3 days ago
1.0 - 3.0 years
5 - 11 Lacs
Pune, Gurugram
Hybrid
Role & Responsibilities:
- Collaborate with ZS internal teams and client teams to shape and implement high-quality technology solutions that address critical business problems
- Understand and analyze business problems thoroughly, and translate them into technical designs effectively
- Design and implement technical features using best practices for the specific technology stack being used
- Assist in the development phase of implementing technology solutions for client engagements, ensuring effective problem-solving
- Apply appropriate development methodologies (e.g., agile, waterfall, system integrated testing, mid-development client reviews, embedded QA procedures, unit testing) to ensure successful and timely completion of projects
- Provide guidance and support to team members in creating comprehensive project implementation plans
- Work closely with a development team to accurately interpret and implement business requirements

Preferred Candidate Profile:
- Bachelor's or Master's degree in Business Analytics, Computer Science, MIS, or a related field with academic excellence
- Proficiency in RDBMS concepts, SQL, and programming languages such as Python
- Strong analytical and problem-solving skills to convert intricate business requirements into technology solutions
- Knowledge of algorithms and data structures

Additional Skills:
- 1-3+ years of relevant professional experience in delivering small/medium-scale technology solutions
- Strong verbal and written communication skills to effectively convey results and issues to internal and client teams
- Familiarity with big data concepts and cloud platforms like AWS, Azure, and Google Cloud Platform
- Understanding of productivity tools such as Copilot and SQL generation
- Travel to other offices as required to collaborate with clients or internal project teams

Perks and Benefits:
ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth and professional development. Our robust skills development programs, multiple career progression options, internal mobility paths, and collaborative culture empower you to thrive as an individual and global team member. We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections.
Posted 3 days ago
10.0 - 17.0 years
20 - 30 Lacs
Pune
Hybrid
We have an opening for a "Principal IT Engineer Applications - Data Engineer" role with one of the top US product-based MNCs, in Pune.

Total experience: 9-12 years; notice period: up to 60 days
Shift timing: 2 PM to 11 PM
Location: Pune, Hinjewadi (Phase 2)

Must-have skills:
- Minimum of 8 years of hands-on experience in software or data engineering roles
- Deep understanding of data modeling and data architecture design; must be a well-rounded technical expert
- Strong expertise in SQL scripting and performance tuning within large-scale data environments
- Proven experience working with high-volume data systems
- Demonstrated ability to solve complex problems using Informatica
- Excellent communication skills with the ability to clearly articulate and explain technical scenarios
- Experience in building data products or solutions from the ground up
- Proficient in Python, shell scripting, and PL/SQL, with a strong focus on automation
- Must have strong hands-on experience; not seeking candidates focused on team management or leadership roles

Good-to-have skills:
- Experience supporting Medicare, Medicaid, or other regulatory healthcare data platforms
- Certifications in Informatica Cloud, Oracle, Databricks, and cloud platforms (e.g., Azure, AWS)

Kindly mail your CV to silpa.pa@peoplefy.com
Posted 3 days ago
12.0 - 17.0 years
12 - 17 Lacs
Pune
Work from Office
Role Overview:
The Technical Architect specializes in traditional ETL tools such as Informatica Intelligent Cloud Services (IICS) and similar technologies. The jobholder designs, implements, and oversees robust ETL solutions to support our organization's data integration and transformation needs.

Responsibilities:
- Design and develop scalable ETL architectures using tools like IICS and other traditional ETL platforms.
- Collaborate with stakeholders to gather requirements and translate them into technical solutions.
- Ensure data quality, integrity, and security throughout the ETL processes.
- Optimize ETL workflows for performance and reliability.
- Provide technical leadership and mentorship to development teams.
- Troubleshoot and resolve complex technical issues related to ETL processes.
- Document architectural designs and decisions for future reference.
- Stay updated with emerging trends and technologies in ETL and data integration.

Key Technical Skills & Responsibilities:
- 12+ years of experience in data integration and ETL development, with at least 3 years in an Informatica architecture role.
- Extensive expertise in Informatica PowerCenter, IICS, and related tools (Data Quality, EDC, MDM).
- Proven track record of designing ETL solutions for enterprise-scale data environments.
- Advanced proficiency in Informatica PowerCenter and IICS for ETL/ELT design and optimization.
- Strong knowledge of SQL, Python, or Java for custom transformations and scripting.
- Experience with data warehousing platforms (Snowflake, Redshift, Azure Synapse) and data lakes.
- Familiarity with cloud platforms (AWS, Azure, GCP) and their integration services.
- Expertise in data modeling, schema design, and integration patterns.
- Knowledge of CI/CD, Git, and infrastructure-as-code (e.g., Terraform).
- Experience working on proposals, customer workshops, assessments, etc. is preferred.
- Must have good communication and presentation skills.

Primary Skills:
- Informatica, IICS
- Data lineage and metadata management
- Data modeling
- Data governance
- Data integration architectures
- Informatica Data Quality

Eligibility Criteria:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience in ETL architecture and development using tools like IICS.
- Strong understanding of data integration, transformation, and warehousing concepts.
- Proficiency in SQL and scripting languages.
- Experience with cloud-based ETL solutions is a plus.
- Familiarity with Agile development methodologies.
- Excellent problem-solving and analytical skills.
- Strong communication and leadership abilities.
- Knowledge of data governance and compliance standards.
- Ability to work in a fast-paced environment and manage multiple priorities.
Posted 3 days ago
3.0 - 7.0 years
0 Lacs
haryana
On-site
The Business Intelligence (BI) Specialist position at Beamstacks on MG Road in Gurgaon involves a range of responsibilities related to the design, development, implementation, management, and support of enterprise BI reporting and ETL processes. The ideal candidate will have exposure to OBIEE Development and Administration, with at least 6 years of development experience in PL/SQL.

Key responsibilities include developing the OBIEE Repository at three layers (Physical, Business Model, and Presentation Layers), creating interactive dashboards with drill-down capabilities using global and local filters, and setting up security configurations. Additionally, the role requires 3 years of experience in data modeling, ETL development, and installation/configuration of ETL and BI tools, including Oracle APEX.

The successful candidate will have hands-on experience with OBIEE (version 11g or higher), data modeling, and installing/configuring Oracle OBIEE in multiple life cycle environments. They should be adept at presenting system architectures to management and technical stakeholders, and possess a strong technical and functional understanding of Oracle OBIEE technologies.

Other requirements for this role include good knowledge of OBIEE administration, best practices, DWBI implementation challenges, and data warehousing. The candidate must hold OBIEE certification on version 11g or higher and have experience with ETL tools and HP Vertica. Domain knowledge in areas such as Supply Chain, Retail, and Manufacturing is preferred.

The BI Specialist will be responsible for developing architectural solutions using OBIEE, providing effort estimates and timelines to project management, and collaborating with Business and IT teams to drive projects forward on a daily basis. They will also lead the development of OBIEE dashboards and reports and work closely with internal stakeholders and development teams throughout the project lifecycle.
Posted 3 days ago
4.0 - 8.0 years
0 Lacs
nagpur, maharashtra
On-site
As an ETL Developer with 4 to 8 years of experience, you will be responsible for hands-on ETL development using the Talend tool. You should have a high level of proficiency in writing complex yet efficient SQL queries. Your role will involve working extensively on PL/SQL packages, procedures, functions, triggers, views, MViews, external tables, partitions, and exception handling for retrieving, manipulating, checking, and migrating complex data sets in Oracle.

In this position, it is essential to have experience in data modeling and warehousing concepts such as star schema, OLAP, OLTP, snowflake schema, fact tables for measurements, and dimension tables. Additionally, familiarity with UNIX scripting, Python/Spark, and big data concepts will be beneficial.

If you are a detail-oriented individual with strong expertise in ETL development, SQL, PL/SQL, data modeling, and warehousing concepts, this role offers an exciting opportunity to work with cutting-edge technologies and contribute to the success of the organization.
Posted 3 days ago
2.0 - 6.0 years
0 Lacs
maharashtra
On-site
The position is for an Officer / Assistant Manager based in Mumbai. The ideal candidate should have a qualification of B.E. / MCA / B.Tech / M.Sc (I.T.) and be between 25 and 30 years of age. You should have a minimum of 2-3 years of ETL development experience with strong knowledge of ETL concepts, tools, and data structures. The ability to analyze and troubleshoot complex data sets and determine data storage needs is essential. Familiarity with data warehousing concepts, in order to build a data warehouse for the organization's internal departments, is required.

Your responsibilities will include creating and enhancing data solutions that enable seamless delivery of data, and collecting, parsing, managing, and analyzing large sets of data. You will lead the design of the logical data model, implement the physical database structure, and construct and implement operational data stores and data marts. Designing, developing, automating, and supporting complex applications to extract, transform, and load data will be part of your role. You must ensure data quality at the time of ETL, develop logical and physical data flow models for ETL applications, and have advanced knowledge of SQL, Oracle, Sqoop, and NiFi commands and queries.

Current CTC and expected CTC should be clearly mentioned. To apply, please email your resume to careers@cdslindia.com with the position applied for in the subject line.
Posted 3 days ago
7.0 - 12.0 years
15 - 30 Lacs
Hyderabad
Work from Office
Role: ETL Development
Location: Hyderabad (Kokapet)
Required Experience: 7+ years
Interview Mode: First-level technical (virtual), then HR round (face to face)
Mode of hire: Permanent
4 days work from office (regular day shifts)
Must have: ETL development, DataStage, and ADF
Share CVs to: aravind.kuppili@otsi.co.in

Role & Responsibilities:
- Education: Bachelor's degree in computer and information technology or a related field, such as engineering.
- Minimum 7 years of development experience in DataStage (version 8.5 or higher), with experience in processing high-volume jobs.
- 7+ years of experience in advanced InfoSphere DataStage design and ADF development.
- 3+ years in DB2 UDB administration and support.
- 2+ years of report solution/design experience.
- Experience with DataStage 11.3 and 8.7 is a must.
- Strong experience with UNIX and shell scripting.
- Mastery level on DataStage 8.7 server and parallel versions.
- Write ETL technical specifications.
- 3+ years of Azure Data Factory (ADF) / Microsoft SQL Server jobs:
  - Using ADF connectors to connect to various data sources and destinations.
  - Implementing data flows within ADF for complex transformations.
  - Scheduling and monitoring ADF pipelines.
  - Creating and managing pipelines in ADF to orchestrate data movement and transformation.
  - Creating and managing SQL Server Agent jobs for automated tasks, including ETL processes.
  - Writing and optimizing SQL queries for data extraction and transformation.
  - Using SSIS packages for complex ETL operations within SQL Server.
  - Troubleshooting and resolving issues related to SQL Server jobs and SSIS packages.
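A minimal sketch of triggering and monitoring an ADF pipeline run programmatically, using the azure-mgmt-datafactory Python SDK; the subscription, resource group, factory, pipeline name, and parameters are all placeholders:

```python
import time

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

# Subscription, resource group, factory, and pipeline names are placeholders.
client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

run = client.pipelines.create_run(
    resource_group_name="rg-data", factory_name="adf-etl",
    pipeline_name="pl_copy_orders", parameters={"load_date": "2024-01-01"},
)

# Poll the run until ADF reports a terminal status.
while True:
    status = client.pipeline_runs.get("rg-data", "adf-etl", run.run_id).status
    if status in ("Succeeded", "Failed", "Cancelled"):
        break
    time.sleep(30)
print(f"pipeline finished with status: {status}")
```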
Posted 3 days ago
5.0 - 7.0 years
18 - 20 Lacs
Pune
Work from Office
Critical Skills to Possess:
- 5+ years of experience in data engineering or ETL development.
- 5+ years of hands-on experience with Informatica.
- Experience in production support, handling tickets, and monitoring ETL systems.
- Strong SQL skills with experience in querying large datasets.
- Familiarity with data warehousing concepts and design (e.g., star schema, snowflake schema).
- Experience with relational databases such as Oracle, SQL Server, or PostgreSQL.
- Knowledge of cloud platforms such as AWS, Azure, or GCP is a plus.

Preferred Qualifications:
- BS degree in Computer Science or Engineering, or equivalent experience.

Roles and Responsibilities:
- Design, develop, and maintain robust ETL pipelines using Informatica.
- Work with data architects and business stakeholders to understand data requirements and translate them into technical solutions.
- Integrate data from various sources including relational databases, flat files, APIs, and cloud-based systems.
- Optimize and troubleshoot existing Informatica workflows for performance and reliability.
- Monitor ETL workflows and proactively address failures, performance issues, and data anomalies.
- Respond to and resolve support tickets related to data loads, ETL job failures, and data discrepancies.
- Provide support for production data pipelines and jobs.
- Ensure data quality and consistency across different systems and pipelines.
- Implement data validation, error handling, and auditing mechanisms within ETL processes.
- Collaborate with data analysts, data scientists, and other engineers to ensure a consistent and accurate data platform.
- Maintain documentation of ETL processes, data flows, and technical designs.
- Monitor daily data loads and resolve any ETL failures or data quality issues.
Posted 4 days ago
6.0 - 11.0 years
3 - 7 Lacs
New Delhi, Chennai, Bengaluru
Work from Office
- Must have 2+ years of working experience in designing ETL flows with Pentaho Data Integration.
- Develop ETL jobs based on given requirements.
- Self-sufficient in understanding existing ETL jobs and reports.
- Analyze data from various sources, including databases and flat files.
- Discuss with customers/end users and gather requirements.
- Build and customize ETL/BI reports.
- Identify problem areas in ETL loads and fix them through performance improvements.
- Must have 3+ years of experience in developing and tuning complex SQL queries using Oracle, PostgreSQL, or other leading DBMSs.
- Write SQL scripts to validate the data.
- Support daily loads and be able to fix issues within SLA.
- Hands-on data migration experience.
- Able to set up reporting for a new client.
- Able to write Linux scripts.
- Provide production support.
- Create source-to-target (mapping) documents.

Required Technical and Professional Expertise:
- Hands-on experience with ETL tools like Pentaho.
- Hands-on experience with at least one reporting tool such as Pentaho BI, Microsoft Power BI, Oracle BI, Tableau, etc.
- Experience in a data warehousing role with a solid understanding of data warehousing approaches and best practices.
- Strong hands-on experience writing SQL scripts to analyze and validate data.
- Expert knowledge in writing SQL commands, queries, and stored procedures.
- Strong knowledge of DB and DW concepts.
- Functional knowledge/understanding of finance, reconciliation, customer service, pricing modules, etc.
- Excellent SQL, PL/SQL, XML, JSON, and database skills.
- Good to have some experience with Python or JavaScript.
- Good to have some knowledge of Kubernetes and Ansible.
- Good to have some knowledge of Linux scripting.

Education: Any graduate
Key Skills: ETL, Pentaho, PL/SQL, ETL development, ETL tools, SQL
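A minimal sketch of the SQL-based data validation this posting calls for, driven from Python against PostgreSQL (one of the DBMSs the posting names); connection details, schemas, and the business rule are illustrative:

```python
import psycopg2

# Connection details are placeholders.
conn = psycopg2.connect(host="dwhost", dbname="dw", user="etl", password="...")
cur = conn.cursor()

# Reconciliation check: row counts between staging and target should match.
cur.execute("SELECT count(*) FROM staging.orders")
staged = cur.fetchone()[0]
cur.execute("SELECT count(*) FROM dw.fact_orders WHERE load_date = current_date")
loaded = cur.fetchone()[0]

if staged != loaded:
    raise RuntimeError(f"load mismatch: staged={staged}, loaded={loaded}")

# Spot-check a business rule: no negative order amounts should survive the load.
cur.execute("SELECT count(*) FROM dw.fact_orders WHERE amount < 0")
bad = cur.fetchone()[0]
assert bad == 0, f"{bad} rows violate the non-negative amount rule"
```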
Posted 4 days ago
6.0 - 8.0 years
8 - 18 Lacs
Pune
Hybrid
Job Title: Lead ETL Developer
Job Location: Pune

Company Introduction:
Join Nitor Infotech, an Ascendion company, where we harness data to drive impactful solutions. Our innovative team is dedicated to excellence in data processing and analytics, making a significant difference in the retail domain. Be part of a collaborative environment that values your expertise and contributions.

Job Overview:
We are seeking an ETL Developer with expertise in advanced SQL, Python, and shell scripting. This full-time position reports to the Data Engineering Manager and is available in a hybrid work model. This is a replacement position within the SRAI - EYC Implementation team.

Key Responsibilities:
- Design and develop ETL processes for data extraction, transformation, and loading.
- Utilize advanced SQL for data processing and analysis.
- Implement data processing solutions using Python and shell scripting.
- Collaborate with cross-functional teams to understand data requirements.
- Maintain and optimize data pipelines for performance and reliability.
- Provide insights and analysis to support business decisions.
- Ensure data quality and integrity throughout the ETL process.
- Stay updated on industry trends and best practices in data engineering.

Must-Have Skills and Qualifications:
- 7-8 years of experience as an ETL Developer.
- Expertise in advanced SQL for data manipulation and analysis.
- Proficiency in Python and shell scripting.
- Foundational understanding of Databricks and Power BI.
- Strong logical problem-solving skills.
- Experience in data processing and transformation.
- Understanding of the retail domain is a plus.

Good-to-Have Skills and Qualifications:
- Familiarity with cloud data platforms (AWS, Azure).
- Knowledge of data warehousing concepts.
- Experience with data visualization tools.
- Understanding of Agile methodologies.

What We Offer:
- Competitive salary and comprehensive benefits package.
- Opportunities for professional growth and advancement.
- Collaborative and innovative work environment.
- Flexible work arrangements.
- Impactful work that drives industry change.

DEI Statement:
At Nitor Infotech, we embrace diversity and inclusion. We actively foster an environment where all voices are heard and valued.

ISMS Statement:
Nitor Infotech maintains ISO 27001 certification. All employees must adhere to our information security policies.
Posted 4 days ago
5.0 - 8.0 years
12 - 18 Lacs
Bengaluru
Work from Office
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 3-5 years of experience in ETL development and data integration.
- Proficiency in SQL and experience with relational databases such as Oracle, SQL Server, or MySQL.
- Familiarity with data warehousing concepts and methodologies.
- Hands-on experience with ETL tools like Informatica, Talend, SSIS, or similar.
- Knowledge of data modeling and data governance best practices.
- Strong analytical skills and attention to detail.
- Excellent communication and teamwork skills.
- Experience with Snowflake or willingness to learn and implement Snowflake-based solutions.
- Experience with big data technologies such as Hadoop or Spark.
- Knowledge of cloud platforms like AWS, Azure, or Google Cloud and their ETL services.
- Familiarity with data visualization tools such as Tableau or Power BI.
- Hands-on experience with Snowflake for data warehousing and analytics.
Posted 4 days ago
6.0 - 10.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Senior Developer specializing in SnapLogic and Apache Airflow, you will be responsible for designing, developing, and maintaining enterprise-level data integration solutions. Your expertise in ETL development, workflow orchestration, and cloud technologies will be crucial for automating data workflows, optimizing performance, and ensuring the reliability and scalability of data systems.

Your key responsibilities will include designing, developing, and managing ETL pipelines using SnapLogic to ensure efficient data transformation and integration across various systems and applications. You will leverage Apache Airflow for workflow automation, job scheduling, and task dependencies to ensure optimized execution and monitoring. Collaboration with cross-functional teams such as Data Engineering, DevOps, and Data Science will be essential to understand data requirements and deliver effective solutions.

In this role, you will be involved in designing and implementing data pipeline architectures to support large-scale data processing in cloud environments like AWS, Azure, and GCP. Developing reusable SnapLogic pipelines, integrating with third-party applications and data sources, optimizing pipeline performance, and providing guidance to junior developers will be part of your responsibilities. Additionally, troubleshooting pipeline failures, implementing automated testing, continuous integration (CI), and continuous delivery (CD) practices for data pipelines will be crucial for maintaining high data quality and minimal downtime.

The required skills and experience for this role include at least 6 years of hands-on experience in data engineering with a focus on SnapLogic and Apache Airflow. Proficiency in SnapLogic Designer, the SnapLogic cloud environment, and Apache Airflow for building data integrations and ETL pipelines is essential. You should have a strong understanding of ETL concepts, data integration, cloud platforms like AWS, Azure, or Google Cloud, data storage systems such as S3, Azure Blob, and Google Cloud Storage, as well as experience with SQL, relational databases, NoSQL databases, REST APIs, and CI/CD pipelines.

Your problem-solving skills, ability to work in an Agile development environment, and strong communication and collaboration skills will be valuable assets in this role. By staying current with new SnapLogic features, Airflow upgrades, and industry best practices, you will contribute to the continuous improvement of data integration solutions.

Join our team at Virtusa, where teamwork, quality of life, and professional development are values we embody. Be part of a global team that cares about your growth and provides exciting projects, opportunities, and exposure to state-of-the-art technologies throughout your career with us. At Virtusa, great minds come together to nurture new ideas and foster excellence in a dynamic environment.
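One way the two tools this posting pairs can fit together: SnapLogic pipelines exposed as triggered tasks can be invoked over HTTPS, so an Airflow task can kick off a SnapLogic run on a schedule. A minimal sketch, with the task URL, token, and payload all placeholders:

```python
from datetime import datetime

import requests
from airflow import DAG
from airflow.operators.python import PythonOperator

# SnapLogic triggered-task URL and bearer token are placeholders for illustration.
SNAPLOGIC_TASK_URL = "https://elastic.snaplogic.com/api/1/rest/slsched/feed/ORG/projects/etl/orders_task"

def trigger_snaplogic(**context):
    resp = requests.post(
        SNAPLOGIC_TASK_URL,
        headers={"Authorization": "Bearer <token>"},
        json={"run_date": context["ds"]},   # pass the Airflow logical date
        timeout=60,
    )
    resp.raise_for_status()    # fail the Airflow task if the trigger fails

with DAG(
    dag_id="snaplogic_orders",
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",
    catchup=False,
) as dag:
    PythonOperator(task_id="trigger_pipeline", python_callable=trigger_snaplogic)
```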
Posted 5 days ago
1.0 - 7.0 years
12 - 16 Lacs
Hyderabad
Work from Office
Skillsoft is seeking an experienced Data Integration Engineer to support and modernize our data integration processes. This role is responsible for managing the traditional ETL lifecycle while driving the transition to event-driven, API-based solutions. The ideal candidate will support existing systems while driving operational excellence and modernization initiatives.

Opportunity Highlights:

ETL Development & Data Management
- Design, develop, and optimize ETL processes to integrate data from multiple sources.
- Ensure data integrity, accuracy, and security across all integration workflows.
- Troubleshoot and resolve ETL job failures, optimizing performance and throughput.

Database Administration & Support
- Support schema design, indexing strategies, and query optimization for efficient data retrieval.
- Provide database administration support for ETL workflows and integration projects.

Modernization & Innovation
- Drive the transition from traditional ETL processes to modern, event-driven, API-based data integration solutions.
- Develop and implement strategies for data process modernization.
- Explore and implement AI/ML-driven automation for API-based integration workflows.
- Stay updated with the latest trends and technologies in data integration and apply them to improve existing systems.

Operational Excellence
- Support and maintain existing data integration systems.
- Optimize data pipelines for performance and efficiency.
- Collaborate with cross-functional teams to understand data needs and deliver effective solutions.
- Define and monitor KPIs for data integration and database performance.

Skills & Qualifications
- Proven experience in managing traditional ETL lifecycles.
- Strong knowledge of event-driven architectures and API-based data integration.
- Proficiency in SQL and experience with database management systems.
- Ability to create and modify C# scripts within SSIS for custom API integrations.
- Experience with cloud-based data integration tools and platforms.
- Experience working in Agile/Scrum environments.
- Effective communication and collaboration skills.
- Ability to manage multiple priorities and deliver in a fast-paced environment.
- A passion for innovation and continuous improvement.
- 5-10 years of experience in ETL development, data integration, and database administration.
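As a rough illustration of the event-driven, API-based pattern this posting contrasts with batch ETL (not the team's actual C#/SSIS stack), here is a minimal Python webhook receiver that upserts each event as it arrives instead of waiting for a nightly load; the endpoint, schema, and SQLite backing store are all placeholders:

```python
import sqlite3

from flask import Flask, request

app = Flask(__name__)

# SQLite stands in for the real warehouse; the endpoint shape is illustrative.
db = sqlite3.connect("integration.db", check_same_thread=False)
db.execute("CREATE TABLE IF NOT EXISTS customers (id TEXT PRIMARY KEY, email TEXT)")

@app.post("/events/customer-updated")
def customer_updated():
    event = request.get_json()
    # Upsert on arrival instead of waiting for a nightly batch ETL run.
    db.execute(
        "INSERT INTO customers (id, email) VALUES (?, ?) "
        "ON CONFLICT(id) DO UPDATE SET email = excluded.email",
        (event["id"], event["email"]),
    )
    db.commit()
    return {"status": "accepted"}, 202

if __name__ == "__main__":
    app.run(port=8080)
```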
Posted 6 days ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
For the role of Mantas Scenario Developer, you should be a certified Oracle Database PL/SQL Developer Professional. Your responsibilities will include using Mantas OFSAA FCCM and Scenario Manager to customize scenarios. A strong understanding of Oracle DB and PL/SQL is imperative, along with the ability to design and develop load processes and queries for large databases with high transaction volumes. Your problem-solving and debugging skills should be top-notch, and you should be able to communicate effectively and collaborate well within a team.

Your role will entail designing high-quality deliverables in alignment with business requirements, adhering to defined standards and design principles. You will also be responsible for developing and maintaining highly scalable ETL applications, integrating code following CI/CD practices using Maven and uDeploy, and reviewing code modules developed by other team members.

Experience in Agile methodology, analyzing existing code to resolve production issues, and working with ETL development tools and data warehousing is essential. Additionally, knowledge of SQL, query tuning, performance tuning, Agile development models, and scrum teams is crucial. Previous experience in the banking domain, design experience in ETL technology, and proficiency in PL/SQL and data warehousing fundamentals are considered advantageous for this role.
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
karnataka
On-site
You are a skilled Snowflake + Python + SQL Developer with 4-6 years of experience, ready to join a dynamic team. Your expertise lies in cloud data platforms, Python programming, and SQL database management. While experience with DBT (Data Build Tool) is a plus, it's not mandatory for this role.

In this role, your primary responsibilities include designing, implementing, and managing data pipelines using Snowflake. You will also be developing and optimizing complex SQL queries for data extraction, transformation, and reporting, as well as handling large-scale data processing and integration using Python.

Data modeling and optimization are crucial aspects of your role. You will develop and maintain Snowflake data models and warehouse architecture, optimizing data pipelines for performance and scalability. Collaboration is key as you work closely with cross-functional teams to understand data needs and provide efficient solutions.

ETL development is another essential part of your role. You will develop and maintain ETL/ELT processes to support data analytics and reporting, utilizing Python scripts and Snowflake tools for data transformation and integration. Monitoring performance, troubleshooting issues, and ensuring data integrity are also part of your responsibilities.

While leveraging DBT for data transformation within Snowflake is optional, it is considered advantageous. You may also develop and maintain DBT models to enhance the quality of data transformations.

Your key skills and qualifications include hands-on experience with Snowflake, Python, and SQL, a strong understanding of SQL databases and data modeling concepts, and experience in building scalable data pipelines and ETL/ELT processes using Python and Snowflake. Knowledge of data warehousing best practices, familiarity with cloud platforms such as AWS, Azure, or GCP, and an understanding of version control systems like Git are also beneficial for this role.
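A minimal sketch of the Python-to-Snowflake integration this role centers on, loading a DataFrame through the connector's `write_pandas` helper and following up with an in-warehouse SQL step (ELT style); the credentials, tables, and data are placeholders:

```python
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

conn = snowflake.connector.connect(
    account="my_account", user="dev", password="...",   # placeholders
    warehouse="DEV_WH", database="ANALYTICS", schema="PUBLIC",
)

# A small transformed batch; in practice this comes from an upstream extract.
df = pd.DataFrame({"ORDER_ID": [1, 2], "AMOUNT": [10.5, 20.0]})

# Bulk-load the DataFrame through an internal stage in one call;
# auto_create_table creates the staging table if it does not exist yet.
success, _, nrows, _ = write_pandas(
    conn, df, table_name="ORDERS_STG", auto_create_table=True
)
print(f"loaded={success}, rows={nrows}")

# Follow-up SQL transformation inside Snowflake (ELT style).
conn.cursor().execute("INSERT INTO ORDERS SELECT ORDER_ID, AMOUNT FROM ORDERS_STG")
```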
Posted 1 week ago
8.0 - 13.0 years
15 - 20 Lacs
Pune
Work from Office
8+ years of experience in ETL/DW projects, with migration experience, team management, and delivery experience. Proven expertise in Snowflake data warehousing, ETL, and data governance. Experience with cloud ETL/ETL migration tools.
Posted 1 week ago
7.0 - 12.0 years
15 - 20 Lacs
Bengaluru
Hybrid
Role & Responsibilities:
- 7 years of experience in modeling and business system designs.
- 5 years of hands-on experience in SQL and Informatica ETL development is a must.
- 3 years of Redshift or Oracle (or comparable database) experience with BI/DW deployments.
- Proven experience with STAR and SNOWFLAKE schema techniques is a must.
- A minimum of 1 year of development experience in Python scripting is mandatory; Unix scripting is an added advantage.
- Proven track record as an ETL developer in delivering successful business intelligence developments with complex data sources.
- Strong analytical skills; enjoys solving complex technical problems.
Posted 1 week ago