8.0 - 12.0 years
10 - 14 Lacs
Pune
Work from Office
Job Summary: This position provides input and support for, and performs, full systems life cycle management activities (e.g., analyses, technical requirements, design, coding, testing, and implementation of systems and applications software). He/She participates in component and data architecture design, technology planning, and testing for Applications Development (AD) initiatives to meet business requirements. This position provides input to applications development project plans and integrations. He/She collaborates with teams and supports emerging technologies to ensure effective communication and achievement of objectives. This position provides knowledge and support for applications development, integration, and maintenance, and provides input to department and project teams on decisions supporting projects.

Responsibilities:
- Full stack development with Java, Oracle, and Angular; DevOps and Agile project management experience is a plus.
- Plans, develops, and manages the organization's information software, applications, systems, and networks.
- Application containerization (Kubernetes, Red Hat OpenShift).
- Experience with public cloud (e.g., Google, Azure).
- Performs systems analysis and design; designs and develops moderate to highly complex applications.
- Develops application documentation and produces integration builds.
- Performs maintenance and support; supports emerging technologies and products.
- Ensures UPS's business needs are met through continual upgrades and development of new technical solutions.

Qualifications:
- 8-12 years of experience
- Bachelor's Degree or international equivalent
Posted 1 day ago
5.0 - 8.0 years
7 - 12 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
About KPI Partners: KPI Partners is a leading provider of technology consulting and solutions, specializing in delivering high-quality services that enable organizations to optimize their operations and achieve their strategic objectives. We are committed to empowering businesses through innovative solutions and a strong focus on customer satisfaction.

Job Description: We are seeking an experienced and detail-oriented ODI Developer to join our dynamic team. The ideal candidate will have a strong background in Oracle Data Integration and ETL processes, possess excellent problem-solving skills, and demonstrate the ability to work collaboratively within a team environment. As an ODI Developer at KPI Partners, you will play a crucial role in designing, implementing, and maintaining data integration solutions that support our clients' analytics and reporting needs.

Key Responsibilities:
- Design, develop, and implement data integration processes using Oracle Data Integrator (ODI) to extract, transform, and load (ETL) data from various sources.
- Collaborate with business analysts and stakeholders to understand data requirements and translate them into technical specifications.
- Optimize ODI processes and workflows for performance improvements and ensure data quality and accuracy.
- Troubleshoot and resolve technical issues related to ODI and data integration processes.
- Maintain documentation related to data integration processes, including design specifications, integration mappings, and workflows.
- Participate in code reviews and ensure adherence to best practices in ETL development.
- Stay updated with the latest developments in ODI and related technologies to continuously improve solutions.
- Support production deployments and provide maintenance and enhancements as needed.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as an ODI Developer or in a similar ETL development role.
- Strong knowledge of Oracle Data Integrator and its components (repositories, models, mappings, etc.).
- Proficient in SQL and PL/SQL for querying and manipulating data.
- Experience with data warehousing concepts and best practices.
- Familiarity with other ETL tools is a plus.
- Excellent analytical and troubleshooting skills.
- Strong communication skills, both verbal and written.
- Ability to work independently and in a team-oriented environment.

Why Join KPI Partners?
- Opportunity to work with a talented and diverse team on cutting-edge projects.
- Competitive salary and comprehensive benefits package.
- Continuous learning and professional development opportunities.
- A culture that values innovative thinking and encourages collaboration.

KPI Partners is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.
Posted 1 day ago
12.0 - 17.0 years
3 - 7 Lacs
Kolkata
Work from Office
Project Role: Data Management Practitioner
Project Role Description: Maintain the quality and compliance of an organization's data assets. Design and implement data strategies, ensuring data integrity and enforcing governance policies. Establish protocols to handle data, safeguard sensitive information, and optimize data usage within the organization. Design and advise on data quality rules and set up effective data compliance policies.
Must have skills: Data Architecture Principles
Good to have skills: NA
Minimum 12 year(s) of experience is required.
Educational Qualification: Any graduate

Summary: As a Data Management Practitioner, you will be responsible for maintaining the quality and compliance of an organization's data assets. Your role involves designing and implementing data strategies, ensuring data integrity, enforcing governance policies, and optimizing data usage within the organization.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute on key decisions
- Design and advise on data quality rules
- Set up effective data compliance policies
- Ensure data integrity and enforce governance policies
- Optimize data usage within the organization

Professional & Technical Skills:
- Must Have Skills: Proficiency in Data Architecture Principles
- Strong understanding of data management best practices
- Experience in designing and implementing data strategies
- Knowledge of data governance and compliance policies
- Ability to optimize data usage for organizational benefit

Additional Information:
- The candidate should have a minimum of 12 years of experience in Data Architecture Principles
- This position is based at our Kolkata office
- A graduate degree in any discipline is required
Posted 5 days ago
8.0 - 10.0 years
13 - 18 Lacs
Hyderabad
Work from Office
Job Track Description: The ETL Data Architect will be responsible for driving data migration strategy and execution within a complex enterprise landscape, bringing in best practices for data migration and integration with Salesforce. The role includes creating the data migration strategy for Salesforce implementations and defining template/uniform file formats for migrating data into Salesforce.

Must-Have Skills:
- Data Architect with 8-10 years of ETL experience and 5+ years of Informatica Cloud (IICS, ICRT) experience.
- 5+ years of experience on Salesforce systems.
- Develop comprehensive data mapping and transformation plans to align data with the Salesforce data model and software solution.
- Good understanding of the Salesforce data model and Schema Builder.
- Excellent understanding of relational database concepts and how to best implement database objects in Salesforce.
- Experience integrating large sets of data into Salesforce from multiple data sources.
- Experience with EDI transactions.
- Experience in design and development of ETL/data pipelines.
- Excellent understanding of SOSL and SOQL and the Salesforce security model (see the extraction sketch below).
- Full understanding of the project life cycle and development methodologies.
- Ability to interact with technical and functional teams.
- Excellent oral, written communication and presentation skills.
- Should be able to work in an offshore/onsite model.

Experience:
- Expert in ETL development with Informatica Cloud using various connectors.
- Experience with real-time integrations and batch scripting.
- Expert in implementing business rules by creating various transformations, working with multiple data sources (flat files, relational and cloud databases, etc.) and developing mappings.
- Experience in using ICS workflow tasks: Session, Control Task, Command tasks, Decision tasks, Event Wait, Email tasks, Pre-session, Post-session, and Pre/Post commands.
- Ability to migrate objects in all phases (DEV, QA/UAT, and PRD) following standard defined processes.
- Performance analysis with large data sets.
- Experience in writing technical specifications based on conceptual design and stated business requirements.
- Experience in designing and maintaining logical and physical data models and communicating them to peers and junior associates using flowcharts, unified data language, and data flow diagrams.
- Good knowledge of SQL, PL/SQL, and data warehousing concepts.
- Experience in using Salesforce SOQL is a plus.

Responsibilities:
- Excellent troubleshooting and debugging skills in Informatica Cloud.
- Significant knowledge of PL/SQL including tuning, triggers, ad hoc queries, and stored procedures.
- Strong analytical skills.
- Works under minimal supervision with some latitude for independent judgement.
- Prepare and package scripts and code across development, test, and QA environments.
- Participate in change control planning for production deployments.
- Conducts tasks and assignments as directed.
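For illustration only, a minimal sketch of the SOQL extract side of such a migration, using the simple-salesforce Python client; the credentials, object, fields, and target schema are hypothetical placeholders, not part of this posting:

```python
# Hypothetical sketch: extract Accounts via SOQL and stage them for migration.
# Assumes the simple-salesforce client library; all credentials are placeholders.
from simple_salesforce import Salesforce

sf = Salesforce(
    username="user@example.com",   # placeholder
    password="password",           # placeholder
    security_token="token",        # placeholder
)

# SOQL extract: pull source records from Salesforce.
result = sf.query_all(
    "SELECT Id, Name, BillingCountry FROM Account WHERE CreatedDate = THIS_YEAR"
)

# Transform: map Salesforce fields onto a hypothetical target warehouse schema.
rows = [
    {"account_id": r["Id"], "name": r["Name"], "country": r["BillingCountry"]}
    for r in result["records"]
]
print(f"Staged {len(rows)} account rows for load")
```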
Posted 6 days ago
3.0 - 8.0 years
35 - 50 Lacs
Bengaluru
Work from Office
About the Role: As a Data Engineer, you will be part of the Data Engineering team. The role is inherently multi-functional: the ideal candidate will work with Data Scientists, Analysts, and Application teams across the company, as well as all other Data Engineering squads at Wayfair. We are looking for someone with a love for data, the ability to understand requirements clearly, and the ability to iterate quickly. Successful candidates will have strong engineering and communication skills and a belief that data-driven processes lead to phenomenal products.

What you'll do:
- Build and launch data pipelines and data products focused on the SMART Org, helping teams push the boundaries of insights, creating new product features using data, and powering machine learning models (see the pipeline sketch below).
- Build cross-functional relationships to understand data needs, build key metrics, and standardize their usage across the organization.
- Utilize current and leading-edge technologies in software engineering, big data, streaming, and cloud infrastructure.

What You'll Need:
- Bachelor's/Master's degree in Computer Science or a related technical subject area, or an equivalent combination of education and experience.
- 3+ years of relevant work experience in the Data Engineering field with web-scale data sets.
- Demonstrated strength in data modeling, ETL development, and data lake architecture.
- Data warehousing experience with big data technologies (Hadoop, Spark, Hive, Presto, Airflow, etc.).
- Coding proficiency in at least one modern programming language (Python, Scala, etc.).
- Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing, plus query performance tuning skills on large data sets.
- Industry experience as a Big Data Engineer working alongside cross-functional teams such as Software Engineering, Analytics, and Data Science, with a track record of manipulating, processing, and extracting value from large datasets.
- Strong business acumen.
- Experience leading large-scale data warehousing and analytics projects, including GCP technologies (BigQuery, Dataproc, GCS, Cloud Composer, Dataflow) or related big data technologies on other cloud platforms such as AWS or Azure.
- Be a team player and introduce/follow best practices in the data engineering space.
- Ability to effectively communicate (both written and verbally) technical information and the results of engineering design at all levels of the organization.

Good to have:
- Understanding of NoSQL databases and Pub/Sub architecture setup.
- Familiarity with BI tools like Looker, Tableau, AtScale, Power BI, or similar.

PS: This role is with one of our clients, a leading name in the retail industry.
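As an illustration of the kind of pipeline this role builds, here is a minimal sketch of a daily batch job expressed as an Airflow DAG that runs an ELT-style transformation in BigQuery. It assumes Airflow 2.4+ and the google-cloud-bigquery client; the DAG id, datasets, and tables are hypothetical placeholders:

```python
# Minimal sketch: a daily batch pipeline as an Airflow DAG.
# Dataset and table names are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from google.cloud import bigquery


def load_daily_metrics():
    """Run an ELT-style aggregation inside BigQuery."""
    client = bigquery.Client()
    client.query(
        """
        INSERT INTO analytics.daily_order_metrics   -- hypothetical target table
        SELECT order_date, COUNT(*) AS orders, SUM(amount) AS revenue
        FROM raw.orders                             -- hypothetical source table
        WHERE order_date = CURRENT_DATE()
        GROUP BY order_date
        """
    ).result()  # block until the query job finishes


with DAG(
    dag_id="daily_order_metrics",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="load_daily_metrics", python_callable=load_daily_metrics)
```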
Posted 1 week ago
10.0 - 15.0 years
4 - 9 Lacs
Bengaluru
Work from Office
Req ID: 322003. We are currently seeking a Sr. ETL Developer to join our team in Bangalore, Karnataka (IN-KA), India (IN). Requirements:
- Strong hands-on experience in SQL and PL/SQL (procedures, functions).
- Expert-level knowledge of ETL flows and jobs (ADF pipeline experience preferred).
- Experience with MS SQL Server (preferred), Oracle DB, PostgreSQL, MySQL.
- Good knowledge of Data Warehouse/Data Mart concepts.
- Good knowledge of data structures/models, integrity constraints, performance tuning, etc.
- Good knowledge of the insurance domain (preferred).
Total Experience: 7-10 years.
Posted 1 week ago
10.0 - 15.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Req ID: 321918. We are currently seeking an ETL and BDX developer to join our team in Bangalore, Karnataka (IN-KA), India (IN). Responsibilities:
- Develop and maintain Power BI dashboards, reports, and datasets.
- Collaborate with stakeholders to gather and analyse business requirements.
- Design robust and scalable data models using Power BI and underlying data sources.
- Write complex DAX expressions for calculated columns, measures, and KPIs.
- Optimize performance of Power BI reports and data models.
- Integrate Power BI with other data sources (SQL Server, Excel, Azure, SharePoint, etc.).
- Implement row-level security and data access control.
- Automate data refresh schedules and troubleshoot refresh failures.
- Mentor junior developers and conduct code reviews.
- Work closely with data engineering teams to ensure data accuracy and integrity.
- Experience working with Power Query and dataflows.
- Strong SQL query-writing skills.
Total Experience: 7-10 years.
Posted 1 week ago
5.0 - 8.0 years
1 - 1 Lacs
Hyderabad
Hybrid
Location: Hyderabad (Hybrid)

Key Responsibilities:
1. Data Engineering (AWS Glue & AWS Services):
- Design, develop, and optimize ETL pipelines using AWS Glue (PySpark); see the sketch below.
- Manage and transform structured and unstructured data from multiple sources into AWS S3, Redshift, or Snowflake.
- Work with AWS Lambda, S3, Athena, and Redshift for data orchestration.
- Implement data lake and data warehouse solutions in AWS.
2. Infrastructure as Code (Terraform & AWS Services):
- Design and deploy AWS infrastructure using Terraform.
- Automate resource provisioning and manage Infrastructure as Code.
- Monitor and optimize cloud costs, security, and compliance.
- Maintain and improve CI/CD pipelines for deploying data applications.
3. Business Intelligence (Tableau Development & Administration):
- Develop interactive dashboards and reports using Tableau.
- Connect Tableau with AWS data sources such as Redshift, Athena, and Snowflake.
- Optimize SQL queries and extracts for performance efficiency.
- Manage Tableau Server administration, including security, access controls, and performance tuning.

Required Skills & Experience:
- 5+ years of experience in AWS data engineering with Glue, Redshift, and S3.
- Strong expertise in ETL development using AWS Glue (PySpark, Scala, or Python).
- Experience with Terraform for AWS infrastructure automation.
- Proficiency in SQL, Python, or Scala for data processing.
- Hands-on experience in Tableau development & administration.
- Strong understanding of cloud security, IAM roles, and permissions.
- Experience with CI/CD pipelines (Git, Jenkins, AWS CodePipeline, etc.).
- Knowledge of data modeling, warehousing, and performance optimization.

Please share your resume to: +91 9361912009
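For context on the Glue/PySpark work described above, a minimal sketch of a Glue job that reads from the Data Catalog, applies a column mapping, and writes Parquet to S3. The database, table, column, and bucket names are hypothetical placeholders:

```python
# Minimal AWS Glue (PySpark) job sketch: catalog read -> mapping -> S3 Parquet write.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.transforms import ApplyMapping
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Extract: source table registered in the Glue Data Catalog (placeholder names).
orders = glue_context.create_dynamic_frame.from_catalog(
    database="sales_db", table_name="raw_orders"
)

# Transform: project and rename columns onto the target schema.
mapped = ApplyMapping.apply(
    frame=orders,
    mappings=[
        ("order_id", "string", "order_id", "string"),
        ("amount", "double", "order_amount", "double"),
    ],
)

# Load: write the curated data to S3 as Parquet (placeholder bucket).
glue_context.write_dynamic_frame.from_options(
    frame=mapped,
    connection_type="s3",
    connection_options={"path": "s3://example-bucket/curated/orders/"},
    format="parquet",
)
job.commit()
```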
Posted 1 week ago
3.0 - 7.0 years
5 - 15 Lacs
Kolkata
Hybrid
Role & responsibilities: We are seeking a developer to design, develop, and maintain data ingestion processes for a data platform built on Microsoft technologies, ensuring data quality and integrity. The role involves collaborating with data architects and business analysts to implement solutions using tools such as ADF and Azure Databricks, and requires strong SQL skills. Key responsibilities include developing, testing, and optimizing ETL workflows and maintaining documentation. A B.Tech degree and 5+ years of ETL development experience in the Microsoft data track are required, along with demonstrated expertise in Agile methodologies, including Scrum, Kanban, or SAFe.
Posted 1 week ago
3.0 - 8.0 years
3 - 6 Lacs
Chennai
Work from Office
Programming languages/Tools: SQL, DataStage, Teradata.
- Design complex ETL jobs in IBM DataStage to load data into the DWH as per business logic.
- Work experience in Teradata Database as a developer.
- Understand and analyse ERP reports and document the logic.
- Identify gaps in the existing solutions to accommodate new business processes introduced by the merger.
- Work on designing TAS workflows to replicate data from SAP into the DWH.
- Prepare test cases and technical specifications for the new solutions.
- Interact with other upstream and downstream application teams and EI teams to build robust data transfer mechanisms between various systems.

Essential Skills Required:
- Sound interpersonal communication skills.
- Coordinate with customers and Business Analysts to understand business and reporting requirements.
- Support the development of business intelligence standards to meet business goals.
- Ability to understand data warehousing concepts and implement reports based on user inputs.
- Areas of expertise include Teradata SQL, DataStage, Teradata, and shell scripting.
- Demonstrated focus on driving for results.
- Ability to work with a cross-functional team.

Employment Experience Required: Minimum 3+ years of technical experience with data warehousing concepts and as an ETL developer.
Posted 1 week ago
5.0 - 10.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Contract duration: 6 months. Experience: 5+ years. Location: WFH (should have a good internet connection). Informatica ETL role requirements:
- Informatica IDMC (must have)
- SQL knowledge (must have)
- Data warehouse concepts and ETL design best practices (must have)
- Data modeling (must have)
- Snowflake knowledge (good to have)
- SAP knowledge (good to have)
- SAP functional knowledge (good to have)
- Good communication skills; team player, self-motivated, strong work ethic
- Flexibility in working hours: 12pm Central time (overlap with US team)
- Confidence, proactiveness, and the ability to demonstrate alternatives to mitigate tools/expertise gaps (fast learner)
Posted 1 week ago
3.0 - 5.0 years
6 - 8 Lacs
Chandigarh
Work from Office
Role Overview: We are seeking a talented ETL Engineer to design, implement, and maintain end-to-end data ingestion and transformation pipelines in Google Cloud Platform (GCP). This role will collaborate closely with data architects, analysts, and BI developers to ensure high-quality, performant data delivery into BigQuery and downstream Power BI reporting layers.

Key Responsibilities:
- Data Ingestion & Landing: Architect and implement landing zones in Cloud Storage for raw data. Manage buckets/objects and handle diverse file formats (Parquet, Avro, CSV, JSON, ORC).
- ETL Pipeline Development: Build and orchestrate extraction, transformation, and loading workflows using Cloud Data Fusion. Leverage Data Fusion Wrangler for data cleansing, filtering, imputation, normalization, type conversion, splitting, joining, sorting, union, pivot/unpivot, and format adjustments.
- Data Modeling: Design and maintain fact and dimension tables using Star and Snowflake schemas. Collaborate on semantic layer definitions to support downstream reporting.
- Load & Orchestration: Load curated datasets into BigQuery across different zones (raw, staging, curated). Develop SQL-based orchestration and transformation within BigQuery (scheduled queries, scripting); see the sketch below.
- Performance & Quality: Optimize ETL jobs for throughput, cost, and reliability. Implement monitoring, error handling, and data quality checks.
- Collaboration & Documentation: Work with data analysts and BI developers to understand requirements and ensure data readiness for Power BI. Maintain clear documentation of pipeline designs, data lineage, and operational runbooks.

Required Skills & Experience:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 3+ years of hands-on experience building ETL pipelines in GCP.
- Proficiency with Cloud Data Fusion, including Wrangler transformations.
- Strong command of SQL, including performance tuning in BigQuery.
- Experience managing Cloud Storage buckets and handling Parquet, Avro, CSV, JSON, and ORC formats.
- Solid understanding of dimensional modeling: fact vs. dimension tables, Star and Snowflake schemas.
- Familiarity with BigQuery data zones (raw, staging, curated) and dataset organization.
- Experience with scheduling and orchestration tools (Cloud Composer, Airflow, or BigQuery scheduled queries).
- Excellent problem-solving skills and attention to detail.

Preferred (Good to Have):
- Exposure to Power BI data modeling and DAX.
- Experience with other GCP services (Dataflow, Dataproc).
- Familiarity with Git, CI/CD pipelines, and infrastructure as code (Terraform).
- Knowledge of Python for custom transformations or orchestration scripts.
- Understanding of data governance best practices and metadata management.
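To make the "Load & Orchestration" step concrete, a minimal sketch of promoting data from a staging zone into a curated star-schema fact table in BigQuery, run through the google-cloud-bigquery Python client. The datasets, tables, and join keys are hypothetical placeholders:

```python
# Minimal sketch: refresh a curated-zone fact table from staging-zone data.
# Dataset and table names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
CREATE OR REPLACE TABLE curated.fact_sales AS   -- hypothetical curated-zone table
SELECT s.sale_id, d.date_key, p.product_key, s.amount
FROM staging.sales AS s                          -- hypothetical staging-zone table
JOIN curated.dim_date AS d ON d.date = s.sale_date
JOIN curated.dim_product AS p ON p.product_id = s.product_id
"""
client.query(sql).result()  # .result() waits for the query job to finish
print("fact_sales refreshed")
```

In practice the same statement could live in a BigQuery scheduled query or a Cloud Composer task; the client call is just the simplest self-contained form.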
Posted 1 week ago
2.0 - 7.0 years
6 - 10 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
CirrusLabs Private Limited is looking for a DWH ETL Developer to join our dynamic team and embark on a rewarding career journey. Consulting with data management teams to get a big-picture idea of the company's data storage needs. Presenting the company with warehousing options based on their storage needs. Designing and coding the data warehousing system to desired company specifications. Conducting preliminary testing of the warehousing environment before data is extracted. Extracting company data and transferring it into the new warehousing environment. Testing the new storage system once all the data has been transferred. Troubleshooting any issues that may arise. Providing maintenance support.
Posted 1 week ago
2.0 - 6.0 years
3 - 8 Lacs
Pune, Sangli
Work from Office
We are looking for a Data Science Engineer with strong experience in ETL development and Talend to join our data and analytics team. The ideal candidate will be responsible for designing robust data pipelines, enabling analytics and AI solutions, and working on scalable data science projects that drive business value.

Key Responsibilities:
- Design, build, and maintain ETL pipelines using Talend Data Integration.
- Extract data from multiple sources (databases, APIs, flat files) and load it into data warehouses or lakes.
- Ensure data integrity, quality, and performance tuning in ETL workflows.
- Implement job scheduling, logging, and exception handling using Talend and orchestration tools.
- Prepare and transform large datasets for analytics and machine learning use cases.
- Build and deploy data pipelines that feed predictive models and business intelligence platforms.
- Collaborate with data scientists to operationalize ML models and ensure they run efficiently at scale.
- Assist in feature engineering, data labeling, and model monitoring processes.

Required Skills & Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, Data Engineering, or a related field.
- 3+ years of experience in ETL development, with at least 2 years using Talend.
- Proficiency in SQL and Python (for data transformation or automation).
- Hands-on experience with data integration, data modeling, and data warehousing.
- Strong knowledge of cloud platforms such as AWS, Azure, or Google Cloud is a must.
- Familiarity with big data tools like Spark, Hadoop, or Kafka is a plus.
Posted 1 week ago
5.0 - 10.0 years
8 - 13 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
About Us: KPI Partners is a leading provider of data analytics and business intelligence solutions. We are committed to helping organizations excel through effective data management. Our innovative team focuses on delivering impactful business insights, and we are looking for talented individuals to join us on this journey.

Job Summary: We are seeking an experienced ETL Developer with expertise in Oracle Data Integrator (ODI) to join our dynamic team. The ideal candidate will be responsible for designing, developing, and maintaining ETL processes to extract data from multiple sources and transform it into a format suitable for analysis. You will work closely with business analysts, data architects, and other stakeholders to ensure the successful implementation of data integration solutions.

Key Responsibilities:
- Design and implement ETL processes using Oracle Data Integrator (ODI) to support data warehousing and business intelligence initiatives.
- Collaborate with business stakeholders to gather requirements and translate them into technical specifications.
- Develop, test, and optimize ETL workflows, mappings, and packages to ensure efficient data loading and processing.
- Perform data quality checks and validations to ensure the accuracy and reliability of transformed data.
- Monitor and troubleshoot ETL processes to resolve issues and ensure timely delivery of data.
- Document ETL processes, technical specifications, and any relevant workflows.
- Stay up-to-date with industry best practices and technology trends related to ETL and data integration.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as an ETL Developer with a focus on Oracle Data Integrator (ODI).
- Strong understanding of ETL concepts, data warehousing, and data modeling.
- Proficiency in SQL and experience with database systems such as Oracle, SQL Server, or others.
- Familiarity with data integration tools and techniques, including data profiling, cleansing, and transformation.
- Experience in performance tuning and optimization of ETL processes.
- Excellent analytical and problem-solving skills.
- Strong communication and teamwork abilities, with a commitment to delivering high-quality results.

What We Offer:
- Competitive salary and benefits package.
- Opportunities for professional growth and career advancement.
- A collaborative and innovative work environment.
- The chance to work on exciting projects with leading organizations across various industries.

KPI Partners is an equal-opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.
Posted 1 week ago
2.0 - 5.0 years
3 - 6 Lacs
Hyderabad
Work from Office
What you will do: Let’s do this. Let’s change the world. In this vital role you will be responsible for working on data extraction, transformation, and loading (ETL) processes, ensuring that data flows smoothly between various systems and databases. This role requires performing data transformation tasks to ensure data accuracy and integrity, working closely with product owners, designers, and other engineers to create high-quality, scalable software solutions, and automating operations, monitoring system health, and responding to incidents to minimize downtime.
- Design, develop, and implement Extract, Transform, Load (ETL) processes to move and transform data from various sources to cloud systems, data warehouses, or data lakes.
- Integrate data from multiple sources (e.g., databases, flat files, cloud services, APIs) into target systems.
- Develop complex transformations to cleanse, enrich, filter, and aggregate data during the ETL process to meet business requirements (an illustrative sketch of these stages follows this posting).
- Tune and optimize ETL jobs for better performance and efficient resource usage, minimizing execution time and errors.
- Identify and resolve technical challenges effectively.
- Stay updated with the latest trends and advancements.
- Work closely with the product team, business team, and other stakeholders.

What we expect of you: Master’s degree and 1 to 3 years of Computer Science, IT, or related field experience OR Bachelor’s degree and 3 to 5 years of Computer Science, IT, or related field experience OR Diploma and 7 to 9 years of Computer Science, IT, or related field experience.

Basic Qualifications:
- Strong expertise in ETL development, data integration, and managing complex ETL workflows, performance tuning, and debugging.
- Strong proficiency in SQL for querying databases, writing scripts, and troubleshooting ETL processes.
- Understanding of data modeling concepts, various schemas, and normalization.
- Strong understanding of software development methodologies, including Agile and Scrum.
- Experience working in a DevOps environment, which involves designing, developing, and maintaining software applications and solutions that meet business needs.

Preferred Qualifications:
- Extensive experience in Informatica PowerCenter or Informatica Cloud for data integration and ETL development.
- Professional Certifications: SAFe® for Teams certification (preferred).

Soft Skills:
- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, virtual teams.
- High degree of initiative and self-motivation.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.

Shift Information: This position requires you to work a later shift and will be assigned to the second shift. Candidates must be willing and able to work during evening shifts, as required based on business requirements. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law.
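The posting names cleanse, enrich, filter, and aggregate as the core transformation stages. As a generic illustration (not Amgen's actual tooling, which centers on Informatica), here is what those four stages look like in a small pandas script; file and column names are hypothetical:

```python
# Illustrative sketch of the cleanse -> enrich -> filter -> aggregate stages.
# File and column names are hypothetical placeholders.
import pandas as pd

orders = pd.read_csv("orders.csv")            # extract from a flat file (placeholder)
regions = pd.read_csv("regions.csv")          # reference data for enrichment

orders = orders.dropna(subset=["order_id"])   # cleanse: drop rows missing the key
orders["amount"] = orders["amount"].fillna(0)

# Enrich: join reference data onto the transactional records.
enriched = orders.merge(regions, on="region_id", how="left")

# Filter: keep only the records in scope for the load.
recent = enriched[enriched["order_date"] >= "2024-01-01"]

# Aggregate: summarize to the grain of the target table, then load.
summary = recent.groupby("region_name")["amount"].agg(["count", "sum"]).reset_index()
summary.to_parquet("curated_orders.parquet")
```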
Posted 1 week ago
2.0 - 5.0 years
3 - 6 Lacs
Hyderabad
Work from Office
What you will do: Let’s do this. Let’s change the world. In this vital role you will be responsible for working on data extraction, transformation, and loading (ETL) processes, ensuring that data flows smoothly between various systems and databases. This role requires performing data transformation tasks to ensure data accuracy and integrity, working closely with product owners, designers, and other engineers to create high-quality, scalable software solutions, and automating operations, monitoring system health, and responding to incidents to minimize downtime.
- Design, develop, and implement Extract, Transform, Load (ETL) processes to move and transform data from various sources to cloud systems, data warehouses, or data lakes.
- Integrate data from multiple sources (e.g., databases, flat files, cloud services, APIs) into target systems.
- Develop complex transformations to cleanse, enrich, filter, and aggregate data during the ETL process to meet business requirements.
- Tune and optimize ETL jobs for better performance and efficient resource usage, minimizing execution time and errors.
- Identify and resolve technical challenges effectively.
- Stay updated with the latest trends and advancements.
- Work closely with the product team, business team, and other stakeholders.

What we expect of you: Bachelor’s degree and 0 to 3 years of Computer Science, IT, or related field experience OR Diploma and 4 to 7 years of Computer Science, IT, or related field experience.

Basic Qualifications:
- Expertise in ETL development, data integration, and managing complex ETL workflows, performance tuning, and debugging.
- Proficient in SQL for querying databases, writing scripts, and troubleshooting ETL processes.
- Understanding of data modeling concepts, various schemas, and normalization.
- Strong understanding of software development methodologies, including Agile and Scrum.
- Experience working in a DevOps environment, which involves designing, developing, and maintaining software applications and solutions that meet business needs.

Preferred Qualifications:
- Expertise in Informatica PowerCenter or Informatica Cloud for data integration and ETL development.
- Professional Certifications: SAFe® for Teams certification (preferred).

Soft Skills:
- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, virtual teams.
- High degree of initiative and self-motivation.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.

Shift Information: This position requires you to work a later shift and will be assigned to the second shift. Candidates must be willing and able to work during evening shifts, as required based on business requirements. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law.
Posted 1 week ago
6.0 - 10.0 years
11 - 15 Lacs
Hyderabad
Work from Office
Data Analytics Manager

What you will do: Let’s do this. Let’s change the world. This role will be the Strategic Insights + Analytics (SIA) Team’s resident subject matter expert on reporting design, meaningful metrics, storytelling through reporting, and building reports optimized to meet stakeholders’ needs. This person will have expert hands-on Tableau, Power BI, and ETL development skills, including, but not limited to, the ability to quickly build minimum viable product dashboards as required by Amgen leaders and key stakeholders. Finally, this role will also provide consultation support to other reporting + analytics developers in Amgen’s CFO organization.

Primary Responsibilities:
- Provide actionable, expert guidance to the SIA and FIT Reporting + Analytics teams regarding reporting design and development.
- Personally develop key reporting and analytics in Tableau or Power BI in response to critical, just-in-time CFO organization requests.
- Progress quickly developed reports to polished, automated, future-proof end states.
- Help design and implement a SIA/FIT reporting + analytics strategy, which could include but isn’t limited to a full-scale reporting migration from the Tableau application to Power BI.
- Stay current with the latest reporting + analytics trends and technology and make reporting strategy recommendations as needed to SIA/FIT leadership.

Collaboration:
- Partner with both US and India-based SIA and FIT colleagues to achieve shared objectives.
- Report directly to the hiring manager, a senior manager based in Thousand Oaks, California.

What we expect of you: We are all different, yet we all use our unique contributions to serve patients.

Required Skills + Qualifications:
- Expert skill at reporting design and defining meaningful metrics.
- Expert proficiency at Tableau and Power BI development.
- Advanced “business analyst” skill at grasping and translating business requirements into technical requirements.
- Clear, concise verbal and written communication.
- Development experience with cloud storage and ETL tools such as Databricks and Prophecy.
- Solid understanding of finance concepts, financial statements, and financial data.
- Skill in managing large and complex datasets.

Additional Preferred Experience:
- Familiarity with Oracle Hyperion, Anaplan, SAP S/4HANA, Workday, and JIRA.
- Ability to work collaboratively with teams and stakeholders outside of SIA/FIT, including cross-functionally.

Education/Prior Employment Qualifications:
- Master’s degree and 5 years of finance or analytics development experience, OR
- Bachelor’s degree and 8 years of finance or analytics development experience.

What you can expect of us: As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease.
Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
Posted 1 week ago
7.0 - 12.0 years
15 - 30 Lacs
Chennai
Hybrid
Join our Mega Tech Recruitment Drive at TaskUs Chennai - where bold ideas, real impact, and ridiculous innovation come together.

Who are we hiring for? We are hiring for Developers, Senior Developers, Leads, Architects, and more.

When is it happening? 24th June 2025, 9 AM to 4 PM IST.

Which skills are we hiring for?
- .NET Full Stack: AWS/Azure + Angular/React/Vue.js
- Oracle Fusion: Functional Finance (AP, AR, GL, CM, and Tax)
- Senior Data Engineer: Tableau / QlikView / Power BI dashboards, Azure Databricks, PySpark, Databricks SQL, JupyterHub/PyCharm
- SQL Server Database Administrator: SQL Server administration (both cloud and on-prem)
- Workday Integration Developer: Workday integration tools (Studio, EIB), Workday Matrix, XML, XSLT
- Workday Configuration Lead Developer: Workday configuration tools (Studio, EIB), Workday Matrix, XML, XSLT, xPath, Simple, Matrix, Composite, Advanced

About TaskUs: TaskUs is a provider of outsourced digital services and next-generation customer experience to fast-growing technology companies, helping its clients represent, protect, and grow their brands. Leveraging a cloud-based infrastructure, TaskUs serves clients in the fastest-growing sectors, including social media, e-commerce, gaming, streaming media, food delivery, ride-sharing, HiTech, FinTech, and HealthTech. The People First culture at TaskUs has enabled the company to expand its workforce to approximately 45,000 employees globally. Presently, we have a presence in twenty-three locations across twelve countries, which include the Philippines, India, and the United States.

What We Offer: At TaskUs, we prioritize our employees' well-being by offering competitive industry salaries and comprehensive benefits packages. Our commitment to a People First culture is reflected in the various departments we have established, including Total Rewards, Wellness, HR, and Diversity. We take pride in our inclusive environment and positive impact on the community. Moreover, we actively encourage internal mobility and professional growth at all stages of an employee's career within TaskUs. Join our team today and experience firsthand our dedication to supporting People First.
Posted 1 week ago
4.0 - 6.0 years
3 - 6 Lacs
Chennai
Work from Office
Job Information: Job Opening ID: ZR_1646_JOB. Date Opened: 14/12/2022. Industry: Technology. Work Experience: 4-6 years. Job Title: ETL Developer. City: Chennai. Province: Tamil Nadu. Country: India. Postal Code: 600001. Number of Positions: 4.

Programming languages/Tools: SQL, DataStage, Teradata.
- Design complex ETL jobs in IBM DataStage to load data into the DWH as per business logic.
- Work experience in Teradata Database as a developer.
- Understand and analyse ERP reports and document the logic.
- Identify gaps in the existing solutions to accommodate new business processes introduced by the merger.
- Work on designing TAS workflows to replicate data from SAP into the DWH.
- Prepare test cases and technical specifications for the new solutions.
- Interact with other upstream and downstream application teams and EI teams to build robust data transfer mechanisms between various systems.

Essential Skills Required:
- Sound interpersonal communication skills.
- Coordinate with customers and Business Analysts to understand business and reporting requirements.
- Support the development of business intelligence standards to meet business goals.
- Ability to understand data warehousing concepts and implement reports based on user inputs.
- Areas of expertise include Teradata SQL, DataStage, Teradata, and shell scripting.
- Demonstrated focus on driving for results.
- Ability to work with a cross-functional team.

Employment Experience Required: Minimum 3+ years of technical experience with data warehousing concepts and as an ETL developer.
Posted 1 week ago
5.0 - 8.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Job Information: Job Opening ID: ZR_2334_JOB. Date Opened: 01/08/2024. Industry: IT Services. Work Experience: 5-8 years. Job Title: Informatica ETL Developer. City: Bangalore South. Province: Karnataka. Country: India. Postal Code: 560066. Number of Positions: 1.

Contract duration: 6 months. Experience: 5+ years. Location: WFH (should have a good internet connection). Informatica ETL role requirements:
- Informatica IDMC (must have)
- SQL knowledge (must have)
- Data warehouse concepts and ETL design best practices (must have)
- Data modeling (must have)
- Snowflake knowledge (good to have)
- SAP knowledge (good to have)
- SAP functional knowledge (good to have)
- Good communication skills; team player, self-motivated, strong work ethic
- Flexibility in working hours: 12pm Central time (overlap with US team)
- Confidence, proactiveness, and the ability to demonstrate alternatives to mitigate tools/expertise gaps (fast learner)
Posted 1 week ago
5.0 - 10.0 years
7 - 12 Lacs
Mumbai
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must have skills: Google BigQuery
Good to have skills: Teradata BI
Minimum 5 year(s) of experience is required.
Educational Qualification: Minimum 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with team members to develop innovative solutions and ensure seamless application functionality.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute on key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Lead application development projects
- Conduct code reviews and ensure coding standards are met

Professional & Technical Skills:
- Must Have Skills: Proficiency in Google BigQuery
- Strong understanding of data warehousing concepts
- Experience with cloud-based data platforms
- Hands-on experience in SQL and database management
- Good To Have Skills: Experience with Teradata BI

Additional Information:
- The candidate should have a minimum of 5 years of experience in Google BigQuery
- This position is based at our Mumbai office
- A minimum of 15 years of full-time education is required
Posted 1 week ago
7.0 - 12.0 years
9 - 14 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must have skills: Teradata BI
Good to have skills: NA
Minimum 7.5 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with teams to develop solutions that align with business needs and requirements.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute on key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Lead the team in implementing innovative solutions
- Conduct regular team meetings to ensure alignment and progress
- Stay updated on industry trends and technologies to enhance team performance

Professional & Technical Skills:
- Must Have Skills: Proficiency in Teradata BI
- Strong understanding of data warehousing concepts
- Experience in ETL processes and data modeling
- Knowledge of SQL and database management
- Hands-on experience in developing BI solutions

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Teradata BI
- This position is based at our Bengaluru office
- 15 years of full-time education is required
Posted 1 week ago
10.0 - 15.0 years
35 - 40 Lacs
Pune
Work from Office
The Impact of a Lead Software Engineer - Data to Coupa: The Lead Software Engineer - Data is a pivotal role at Coupa, responsible for leading the architecture, design, and optimization of the data infrastructure that powers our business. This individual will collaborate with cross-functional teams, including Data Scientists, Product Managers, and Software Engineers, to build and maintain scalable, high-performance data solutions. The Lead Software Engineer - Data will drive the development of robust data architectures capable of handling large and complex datasets, while ensuring data integrity, security, and governance. Additionally, this role will provide technical leadership, mentoring engineers and defining best practices to ensure the efficiency and scalability of our data systems. Suitable candidates will have a strong background in data engineering, with experience in data modeling, ETL development, and data pipeline optimization. They will also have deep expertise in programming languages such as Python, Java, or Scala, along with hands-on experience in cloud-based data storage and processing technologies such as AWS, Azure, or GCP. The impact of a skilled Lead Software Engineer - Data at Coupa will be significant, ensuring that our platform is powered by scalable, reliable, and high-quality data solutions. This role will enable the company to deliver innovative, data-driven solutions to our customers and partners. Their work will contribute to the overall success and growth of Coupa, solidifying its position as a leader in cloud-based spend management solutions.

What You'll Do:
- Lead and drive the development and optimization of scalable data architectures and pipelines.
- Design and implement best-in-class ETL/ELT solutions for real-time and batch data processing.
- Optimize Spark clusters for performance, reliability, and cost efficiency, implementing monitoring solutions to identify bottlenecks (see the sketch below).
- Architect and maintain cloud-based data infrastructure leveraging AWS, Azure, or GCP services.
- Ensure data security and governance, enforcing compliance with industry standards and regulations.
- Develop and promote best practices for data modeling, processing, and analytics.
- Mentor and guide a team of data engineers, fostering a culture of innovation and technical excellence.
- Collaborate with stakeholders, including Product, Engineering, and Data Science teams, to support data-driven decision-making.
- Automate and streamline data ingestion, transformation, and analytics processes to enhance efficiency.
- Develop real-time and batch data processing solutions, integrating structured and unstructured data sources.

What you will bring to Coupa:
- Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), as well as working familiarity with a variety of databases.
- Expertise in processing large workloads and complex code on Spark clusters.
- Expertise in setting up monitoring for Spark clusters and driving optimization based on insights and findings.
- Experience in designing and implementing scalable Data Warehouse solutions to support analytical and reporting needs.
- Experience with API development and design with REST or GraphQL.
- Experience building and optimizing 'big data' data pipelines, architectures, and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets.
- Build processes supporting data transformation, data structures, metadata, dependency, and workload management.
- Working knowledge of message queuing, stream processing, and highly scalable 'big data' data stores.
- Strong project management and organizational skills.
- Experience supporting and working with cross-functional teams in a dynamic environment.

We are looking for a candidate with 10+ years of experience in Data Engineering, with at least 3+ years in a Technical Lead role, who has attained a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field. They should also have experience using the following software/tools:
- Object-oriented/object function scripting languages: Python, Java, C++, .NET, etc. Expertise in Python is a must.
- Big data tools: Spark, Kafka, etc.
- Relational SQL and NoSQL databases, including Postgres and Cassandra.
- Data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
- AWS cloud services: EC2, EMR, RDS, Redshift.
- Stream-processing systems: Storm, Spark Streaming, etc.
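To ground the Spark batch-processing and tuning work described above, a minimal PySpark sketch showing two of the common levers (shuffle-partition sizing and caching) in an aggregation job. The S3 paths, columns, and settings are hypothetical placeholders, not Coupa's actual pipeline:

```python
# Minimal PySpark batch sketch with two common tuning levers.
# Paths, columns, and config values are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder.appName("orders-batch")
    .config("spark.sql.shuffle.partitions", "200")  # size shuffles to the cluster
    .getOrCreate()
)

orders = spark.read.parquet("s3://example-bucket/raw/orders/")
orders = orders.filter(F.col("status") == "COMPLETE").cache()  # reuse across actions

daily = orders.groupBy("order_date").agg(
    F.count("*").alias("orders"),
    F.sum("amount").alias("revenue"),
)

# Partition output by date so downstream readers can prune unneeded files.
daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-bucket/curated/daily_orders/"
)
```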
Posted 1 week ago
3.0 - 6.0 years
15 - 25 Lacs
Pune
Work from Office
About the role: As an Ab Initio Admin, you will make an impact by leveraging your expertise in data warehousing and ETL processes and driving data integration solutions. You will be a valued member of the AI Analytics group and work collaboratively with CMT team members. In this role, you will:
- Develop and implement efficient ETL processes using Ab Initio tools to ensure seamless data integration and transformation.
- Collaborate with cross-functional teams to gather requirements and design data solutions that meet business needs.
- Optimize data warehousing solutions by applying advanced scheduling techniques and SQL queries.
- Troubleshoot and resolve data-related issues to maintain system reliability and performance.
- Provide technical expertise in Ab Initio Conduct>It and Co>Operating System to enhance data processing capabilities.
- Ensure data accuracy and consistency by conducting thorough testing and validation of ETL processes.

Work model: We believe hybrid work is the way forward as we strive to provide flexibility wherever possible. Based on this role's business requirements, this is a hybrid position requiring 3 days a week in a client or Cognizant office in your respective work location. Regardless of your working arrangement, we are here to support a healthy work-life balance through our various wellbeing programs.

What you must have to be considered:
- Strong knowledge of data warehousing concepts and scheduling basics to design effective data solutions.
- Proficiency in ETL processes and SQL for efficient data manipulation and transformation.
- Hands-on experience with Ab Initio GDE and Conduct>It for robust data integration.
- Ability to use Unix shell scripting to automate routine tasks and improve operational efficiency.

These will help you stand out:
- Expertise in Ab Initio Co>Operating System to optimize data processing workflows.
- Skill in Unix shell scripting to automate tasks and streamline operations.
- A collaborative mindset to work effectively in a hybrid work model and day shift environment.

We're excited to meet people who share our mission and can make an impact in a variety of ways. Don't hesitate to apply, even if you only meet the minimum requirements listed. Think about your transferable experiences and unique skills that make you stand out as someone who can bring new and exciting things to this role.
Posted 1 week ago