1.0 - 2.0 years
3 - 5 Lacs
Noida, Gurugram, Bengaluru
Work from Office
What you'll do:
- Build complex solutions for clients using programming languages, ETL service platforms, Cloud, etc.
- Take complete ownership of activities and assigned responsibilities across all phases of the project lifecycle to solve business problems across one or more client engagements
- Apply appropriate development methodologies (e.g., agile, waterfall) and best practices (e.g., mid-development client reviews, embedded QA procedures, unit testing) to ensure successful and timely completion of assignments
- Collaborate with other team members to leverage expertise and ensure seamless transitions
- Exhibit flexibility in taking on new and challenging problems and demonstrate excellent task management
- Assist in creating project outputs such as business case development, solution vision and design, user requirements, prototypes, technical architecture (if needed), test cases, and operations management
- Bring transparency in driving assigned tasks to completion and report accurate status
- Bring a consulting mindset to problem solving and innovation by leveraging technical and business knowledge/expertise, and collaborate across other teams
- Assist senior team members and delivery leads with project management responsibilities

What you'll bring:
- Bachelor's degree with specialization in Computer Science, IT, or other computer-related disciplines, with a record of academic success
- Up to 2 years of relevant consulting-industry experience working on small/medium-scale technology solution delivery engagements
- Experience in ETL interfacing technologies like Informatica, Talend, SSIS, etc.
- Experience in data warehousing and SQL
- Exposure to cloud platforms (AWS, Azure, GCP) is a plus
- Strong verbal and written communication skills with the ability to articulate results and issues to internal and client teams
- Proven ability to work creatively and analytically in a problem-solving environment
- Ability to work within a virtual global team environment and contribute to the overall timely delivery of multiple projects
- Willingness to travel to other global offices as needed to work with client or other internal project teams

Location: Bengaluru, Gurugram, Noida, Pune
Posted 3 days ago
8.0 - 10.0 years
4 - 4 Lacs
Hyderābād
On-site
Overview: PepsiCo Data BI & Integration Platforms is seeking an experienced, Informatica-certified professional to manage and maintain the Informatica Data Integration platform, ensuring its smooth operation and security. The ideal candidate will have extensive hands-on experience and deep expertise in platform administration, troubleshooting, and advanced configuration of the following Informatica platforms, both on-premises and in the cloud (AWS/Azure):
- Informatica PowerCenter
- Informatica Cloud Data Integration (IICS)
- Informatica Data Integration Hub (DIH)

Responsibilities:
- Informatica platform administration
- Patching and upgrades
- Troubleshooting and problem resolution: identifying and resolving platform and data integration job/mapping issues, including performance degradation, connectivity problems, and security breaches; participating in project planning and change management, including root cause analysis for issues
- On-call support: providing on-call support for production environments
- Documentation: creating and maintaining documentation of configuration changes, system processes, and troubleshooting procedures
- Collaboration: working closely with development, operations, and other teams to support application lifecycle management and ensure smooth operation; providing support and guidance to users working with the Informatica platform
- Installation and configuration: installing, configuring, and upgrading Informatica software and related infrastructure; managing repository maintenance, scheduling, grid management, and version control; configuring and managing the CDI service, including runtime environments, connections, and mappings
- Performance tuning and optimization: identifying and recommending/implementing performance optimization strategies for the Informatica platform and its jobs, workflows, and mappings
- Security administration: managing user accounts, roles, and permissions; configuring security settings, including LDAP and SSL for authentication and enabling single sign-on; ensuring compliance with relevant regulations and policies
- Automation and scripting: developing and implementing scripts to automate routine platform tasks and manage the various environments, including integration with Elastic, Splunk, and ServiceNow; developing and implementing automation strategies, including CI/CD pipelines, and analyzing processes for improvements
- Monitoring and alerting: monitoring the Informatica platform for performance, capacity, and availability; identifying bottlenecks and implementing optimization strategies

Cloud Infrastructure & Automation:
- Implement and support application migration, modernization, and transformation projects, leveraging cloud-native technologies and methodologies
- Implement cloud infrastructure policies, standards, and best practices, ensuring the cloud environment adheres to security and regulatory requirements
- Deploy and optimize cloud-based infrastructure using Azure/AWS services that meet the performance, availability, scalability, and reliability needs of our applications and services
- Drive troubleshooting of cloud infrastructure issues, ensuring timely resolution and root cause analysis by partnering with the global cloud center of excellence, enterprise application teams, and PepsiCo premium cloud partners (Microsoft, AWS)
- Establish and maintain effective communication and collaboration with internal and external stakeholders, including business leaders, developers, customers, and vendors
- Develop Infrastructure as Code (IaC) to automate provisioning and management of cloud resources
- Write and maintain scripts for automation and deployment using PowerShell, Python, or the Azure/AWS CLI
- Work with stakeholders to document architectures, configurations, and best practices
- Knowledge of cloud security principles around data protection, identity and access management (IAM), compliance and regulatory requirements, threat detection and prevention, and disaster recovery and business continuity

Qualifications:
- Bachelor's degree in computer science or a related field, or equivalent experience
- 8 to 10 years of experience in Informatica platform architecture, operations, and security, with at least 8 years in a technical leadership role
- Extensive hands-on experience with Informatica platform setup and management: PowerCenter, IICS, DIH
- Certifications in Informatica platform administration, such as Cloud Data Integration Administration, Informatica Data Integration Hub Administrator, or Informatica PowerCenter Administrator
- Strong hands-on Windows and Linux administration skills
- Extensive hands-on experience with Informatica Cloud Data Integration (CDI), including mapping design, task configuration, and runtime environment management
- Strong hands-on experience with SQL queries (DML) and database management (Oracle, SQL Server)
- Strong understanding of data integration concepts, including ETL, data warehousing, and data quality
- Strong knowledge of cloud architecture, design, and deployment principles and practices, including microservices, serverless, containers, and DevOps
- Proficiency in scripting and automation tools such as Bash, Perl, PowerShell, Python, Terraform, and Ansible
- Strong expertise in Azure/AWS Databricks platform administration, DevOps, Kubernetes, virtual machines, and monitoring and security tools
- Strong expertise in Azure/AWS networking and security, including network endpoints and network security groups, firewalls, external/internal DNS, load balancers, virtual networks, and subnets
- Excellent problem-solving, analytical, and communication skills, with the ability to explain complex technical concepts to non-technical audiences
- Strong self-organization, time management, and prioritization skills
- A high level of attention to detail, excellent follow-through, and reliability
- Strong collaboration, teamwork, and relationship-building skills across multiple levels and functions in the organization
- Ability to listen and to establish rapport and credibility as a strategic partner within the business unit or function, as well as with leadership and functional teams
- Strategic thinker focused on business-value results that utilize technical solutions
- Strong communication skills in writing, speaking, and presenting
- Ability to work effectively in a multi-tasking environment
- Fluent in English
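The automation and monitoring responsibilities above often start with routine log triage: scanning platform logs for failed sessions and summarizing the errors. The sketch below is purely illustrative; the log layout, workflow name, and session names are invented and do not reflect Informatica's actual session-log format.

```python
import re
from collections import Counter

# Invented sample log; real Informatica session logs use a different layout.
SAMPLE_LOG = """\
2024-05-01 02:00:11 INFO  wf_daily_sales session s_load_orders started
2024-05-01 02:07:43 ERROR wf_daily_sales session s_load_orders FAILED: ORA-12541 no listener
2024-05-01 02:10:02 INFO  wf_daily_sales session s_load_customers started
2024-05-01 02:15:10 INFO  wf_daily_sales session s_load_customers SUCCEEDED
"""

LINE_RE = re.compile(
    r"^(?P<ts>\S+ \S+) (?P<level>\w+)\s+(?P<workflow>\S+) session (?P<session>\S+) (?P<msg>.*)$"
)

def summarize_failures(log_text):
    """Count failed sessions and collect the matching error lines."""
    failures = Counter()
    details = []
    for line in log_text.splitlines():
        m = LINE_RE.match(line)
        if m and m.group("level") == "ERROR":
            failures[m.group("session")] += 1
            details.append((m.group("ts"), m.group("session"), m.group("msg")))
    return failures, details

failures, details = summarize_failures(SAMPLE_LOG)
for ts, session, msg in details:
    print(f"{ts} {session}: {msg}")
```

In practice a script like this would feed the result into the alerting integrations the posting names (Elastic, Splunk, ServiceNow) rather than printing it.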
Posted 3 days ago
1.0 - 2.0 years
2 - 4 Lacs
Noida, Gurugram, Bengaluru
Work from Office
What you'll do:
- Build complex solutions for clients using programming languages, ETL service platforms, Cloud, etc.
- Take complete ownership of activities and assigned responsibilities across all phases of the project lifecycle to solve business problems across one or more client engagements
- Apply appropriate development methodologies (e.g., agile, waterfall) and best practices (e.g., mid-development client reviews, embedded QA procedures, unit testing) to ensure successful and timely completion of assignments
- Collaborate with other team members to leverage expertise and ensure seamless transitions
- Exhibit flexibility in taking on new and challenging problems and demonstrate excellent task management
- Assist in creating project outputs such as business case development, solution vision and design, user requirements, prototypes, technical architecture (if needed), test cases, and operations management
- Bring transparency in driving assigned tasks to completion and report accurate status
- Bring a consulting mindset to problem solving and innovation by leveraging technical and business knowledge/expertise, and collaborate across other teams
- Assist senior team members and delivery leads with project management responsibilities

What you'll bring:
- Bachelor's degree with specialization in Computer Science, IT, or other computer-related disciplines, with a record of academic success
- Up to 2 years of relevant consulting-industry experience working on small/medium-scale technology solution delivery engagements
- Experience in ETL interfacing technologies like Informatica, Talend, SSIS, etc.
- Experience in data warehousing and SQL
- Exposure to cloud platforms (AWS, Azure, GCP) is a plus

Additional Skills:
- Strong verbal and written communication skills with the ability to articulate results and issues to internal and client teams
- Proven ability to work creatively and analytically in a problem-solving environment
- Ability to work within a virtual global team environment and contribute to the overall timely delivery of multiple projects
- Willingness to travel to other global offices as needed to work with client or other internal project teams
Posted 3 days ago
15.0 years
55 Lacs
Hyderābād
On-site
What You Will Do:
As a Data Governance Architect at Kanerika, you will play a pivotal role in shaping and executing the enterprise data governance strategy. Your responsibilities include:

1. Strategy, Framework, and Governance Operating Model
- Develop and maintain enterprise-wide data governance strategies, standards, and policies.
- Align governance practices with business goals like regulatory compliance and analytics readiness.
- Define roles and responsibilities within the governance operating model.
- Drive governance maturity assessments and lead change management initiatives.

2. Stakeholder Alignment & Organizational Enablement
- Collaborate across IT, legal, business, and compliance teams to align governance priorities.
- Define stewardship models and create enablement, training, and communication programs.
- Conduct onboarding sessions and workshops to promote governance awareness.

3. Architecture Design for Data Governance Platforms
- Design scalable and modular data governance architecture.
- Evaluate tools like Microsoft Purview, Collibra, Alation, BigID, and Informatica.
- Ensure integration with metadata, privacy, quality, and policy systems.

4. Microsoft Purview Solution Architecture
- Lead end-to-end implementation and management of Microsoft Purview.
- Configure RBAC, collections, metadata scanning, business glossary, and classification rules.
- Implement sensitivity labels, insider risk controls, retention, data map, and audit dashboards.

5. Metadata, Lineage & Glossary Management
- Architect metadata repositories and ingestion workflows.
- Ensure end-to-end lineage (ADF → Synapse → Power BI).
- Define governance over the business glossary and approval workflows.

6. Data Classification, Access & Policy Management
- Define and enforce rules for data classification, access, retention, and sharing.
- Align with GDPR, HIPAA, CCPA, and SOX regulations.
- Use Microsoft Purview and MIP for policy enforcement automation.

7. Data Quality Governance
- Define KPIs, validation rules, and remediation workflows for enterprise data quality.
- Design scalable quality frameworks integrated into data pipelines.

8. Compliance, Risk, and Audit Oversight
- Identify risks and define standards for compliance reporting and audits.
- Configure usage analytics, alerts, and dashboards for policy enforcement.

9. Automation & Integration
- Automate governance processes using PowerShell, Azure Functions, Logic Apps, and REST APIs.
- Integrate governance tools with Azure Monitor, Synapse Link, Power BI, and third-party platforms.

Required Qualifications:
- 15+ years in data governance and management.
- Expertise in Microsoft Purview, Informatica, and related platforms.
- Experience leading end-to-end governance initiatives.
- Strong understanding of metadata, lineage, policy management, and compliance regulations.
- Hands-on skills in Azure Data Factory, REST APIs, PowerShell, and governance architecture.

Job Type: Full-time
Pay: ₹5,500,000.00 per year
Benefits: Health insurance, internet reimbursement, life insurance, paid time off, Provident Fund
Schedule: Monday to Friday
Work Location: In person
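The classification work described in section 6 above is often rule-driven: column names and content are matched against patterns that map to sensitivity labels. A purely illustrative sketch of name-based tagging follows; the label names and regexes are assumptions, and real platforms such as Microsoft Purview ship far richer built-in classifiers.

```python
import re

# Hypothetical rule set: label -> pattern over column names (illustration only).
CLASSIFICATION_RULES = {
    "PII.Email": re.compile(r"e[-_]?mail", re.I),
    "PII.Phone": re.compile(r"phone|mobile", re.I),
    "PII.Name":  re.compile(r"(first|last|full)[-_]?name", re.I),
}

def classify_columns(columns):
    """Map each column name to every classification label whose pattern matches."""
    result = {}
    for col in columns:
        labels = [label for label, pat in CLASSIFICATION_RULES.items() if pat.search(col)]
        result[col] = labels or ["Unclassified"]
    return result

tags = classify_columns(["customer_email", "mobile_no", "order_total", "first_name"])
```

A governance platform would then attach retention and access policies to the resulting labels rather than to individual columns.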
Posted 3 days ago
5.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
At EY, we're all in to shape your future with confidence. We'll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.

EY-Consulting - Data and Analytics - Senior - Clinical Integration Developer

EY's Consulting Services is a unique, industry-focused business unit that provides a broad range of integrated services that leverage deep industry experience with strong functional and technical capabilities and product knowledge. EY's financial services practice provides integrated Consulting services to financial institutions and other capital markets participants, including commercial banks, retail banks, investment banks, broker-dealers & asset management firms, and insurance firms from leading Fortune 500 companies. Within EY's Consulting Practice, the Data and Analytics team solves big, complex issues and capitalizes on opportunities to deliver better working outcomes that help expand and safeguard businesses, now and in the future. This way we help create a compelling business case for embedding the right analytical practice at the heart of clients' decision-making.

The opportunity
We're looking for Clinical Trials Integration Developers with 5+ years of experience in software development within the life sciences domain to support the integration of Medidata's clinical trial systems across the Client R&D environment. This role offers the chance to build robust, compliant integration solutions, contribute to the design of clinical data workflows, and ensure interoperability across critical clinical applications. You will collaborate closely with business and IT teams, playing a key role in enhancing data flow, supporting trial operations, and driving innovation in clinical research.

Your Key Responsibilities
- Design and implement integration solutions to connect Medidata clinical trial systems with other applications within the clinical data landscape.
- Develop and configure system interfaces using programming languages (e.g., Java, Python, C#) or integration middleware tools (e.g., Informatica, AWS, Apache NiFi).
- Collaborate with clinical business stakeholders and IT teams to gather requirements, define technical specifications, and ensure interoperability.
- Create and maintain integration workflows and data mappings that align with clinical trial data standards (e.g., CDISC, SDTM, ADaM).
- Ensure all development and implementation activities comply with GxP regulations and are aligned with validation best practices.
- Participate in agile development processes, including sprint planning, code reviews, testing, and deployment.
- Troubleshoot and resolve integration-related issues, ensuring stable and accurate data flow across systems.
- Document integration designs, workflows, and technical procedures to support long-term maintainability.
- Contribute to team knowledge sharing and continuous improvement initiatives within the integration space.

Skills And Attributes For Success
- A hands-on, solution-driven approach to implementing integration workflows using code or middleware tools within clinical data environments.
- Strong communication and problem-solving skills with the ability to collaborate effectively with both technical and clinical teams.
- Ability to understand and apply clinical data standards and validation requirements when developing system integrations.

To qualify for the role, you must have
- Experience: Minimum 5 years in software development within the life sciences domain, preferably in clinical trial management systems.
- Education: Must be a graduate, preferably BE/B.Tech/BCA/B.Sc IT.
- Technical Skills: Proficiency in programming languages such as Java, Python, or C#, and experience with integration middleware like Informatica, AWS, or Apache NiFi; strong background in API-based system integration.
- Domain Knowledge: Solid understanding of clinical trial data standards (e.g., CDISC, SDTM, ADaM) and data management processes; experience with agile methodologies and GxP-compliant development environments.
- Soft Skills: Strong problem-solving abilities, clear communication, and the ability to work collaboratively with clinical and technical stakeholders.
- Additional Attributes: Capable of implementing integration workflows and mappings, with attention to detail and a focus on delivering compliant and scalable solutions.

Ideally, you'll also have
- Hands-on experience with ETL tools and clinical data pipeline orchestration frameworks relevant to clinical research.
- Hands-on experience with clinical R&D platforms such as Oracle Clinical, Medidata RAVE, or other EDC systems.
- Proven experience leading small integration teams and engaging with cross-functional stakeholders in regulated (GxP) environments.

What We Look For
- A team of people with commercial acumen, technical experience, and enthusiasm to learn new things in this fast-moving environment
- An opportunity to be a part of a market-leading, multi-disciplinary team of 1,400+ professionals, in the only integrated global transaction business worldwide
- Opportunities to work with EY Consulting practices globally with leading businesses across a range of industries

What Working At EY Offers
At EY, we're dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange.

Plus, we offer:
- Support, coaching, and feedback from some of the most engaging colleagues around
- Opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that's right for you

EY | Building a better working world
EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets. Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
Posted 3 days ago
0 years
0 Lacs
Navi Mumbai, Maharashtra, India
On-site
Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your Role And Responsibilities
As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results

Preferred Education: Master's degree

Required Technical And Professional Expertise
- Expertise in data warehousing / information management / data integration / business intelligence using the ETL tool Informatica PowerCenter
- Knowledge of cloud, Power BI, and data migration on cloud
- Experience in Unix shell scripting and Python
- Experience with relational SQL, Big Data, etc.

Preferred Technical And Professional Experience
- Knowledge of the MS Azure cloud
- Experience in Informatica PowerCenter
- Experience in Unix shell scripting and Python
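The cleanse-and-integrate work described above reduces, at its smallest, to an extract-transform-load loop. Below is a minimal stdlib-only sketch; the CSV sample and table schema are invented for illustration, and a tool like Informatica PowerCenter would implement the same steps declaratively at scale.

```python
import csv
import io
import sqlite3

# Invented sample extract; one row has a missing amount to show cleansing.
RAW = """order_id,amount,currency
1001,250.00,INR
1002,99.50,INR
1003,,INR
"""

def extract(text):
    """Parse the raw CSV into a list of dict rows."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Cleanse: drop rows with a missing amount and cast types."""
    return [(int(r["order_id"]), float(r["amount"])) for r in rows if r["amount"]]

def load(rows, conn):
    """Load into the target table idempotently (re-runs replace, not duplicate)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id INTEGER PRIMARY KEY, amount REAL)"
    )
    conn.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW)), conn)
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone()
```

The `INSERT OR REPLACE` keyed on the primary key is what makes the load safe to re-run, a property production pipelines usually insist on.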
Posted 3 days ago
6.0 years
0 Lacs
Sanganer, Rajasthan, India
On-site
Unlock yourself. Take your career to the next level. At Atrium, we live and deliver at the intersection of industry strategy, intelligent platforms, and data science, empowering our customers to maximize the power of their data to solve their most complex challenges. We have a unique understanding of the role data plays in the world today and serve as market leaders in intelligent solutions. Our data-driven, industry-specific approach to business transformation for our customers places us uniquely in the market.

Who are you?
You are smart, collaborative, and take ownership to get things done. You love to learn and are intellectually curious about business and technology tools, platforms, and languages. You are energized by solving complex problems and bored when you don't have something to do. You love working in teams and are passionate about pulling your weight to make sure the team succeeds.

What will you be doing at Atrium?
In this role, you will join the best and brightest in the industry to skillfully push the boundaries of what's possible. You will work with customers to make smarter decisions through innovative problem-solving using data engineering, analytics, and systems of intelligence. You will partner to advise, implement, and optimize solutions through industry expertise, leading cloud platforms, and data engineering. As a Snowflake Data Engineering Lead, you will be responsible for expanding and optimizing the data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. You will support the software developers, database architects, data analysts, and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects.
In This Role, You Will
- Lead the design and architecture of end-to-end data warehousing and data lake solutions, focusing on the Snowflake platform and incorporating best practices for scalability, performance, security, and cost optimization
- Assemble large, complex data sets that meet functional and non-functional business requirements
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Lead and mentor both onshore and offshore development teams, creating a collaborative environment
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, DBT, Python, AWS, and Big Data tools
- Develop ELT processes to ensure timely delivery of required data for customers
- Implement data quality measures to ensure accuracy, consistency, and integrity of data
- Design, implement, and maintain data models that can support the organization's data storage and analysis needs
- Deliver technical and functional specifications to support data governance and knowledge sharing

In This Role, You Will Have
- Bachelor's degree in Computer Science, Software Engineering, or an equivalent combination of relevant work experience and education
- 6+ years of experience delivering consulting services to medium and large enterprises; implementations must have included a combination of the following experiences: data warehousing or Big Data consulting for mid-to-large-sized organizations
- 3+ years of experience specifically with Snowflake, demonstrating deep expertise in its core features and advanced capabilities
- Strong analytical skills with a thorough understanding of how to interpret customer business needs and translate those into a data architecture
- SnowPro Core certification (highly desired)
- Hands-on experience with Python (Pandas, DataFrames, functions)
- Strong proficiency in SQL (stored procedures, functions), including debugging, performance optimization, and database design
- Strong experience with Apache Airflow and API integrations
- Solid experience in any one of the ETL/ELT tools (DBT, Coalesce, Wherescape, Mulesoft, Matillion, Talend, Informatica, SAP BODS, DataStage, Dell Boomi, etc.)
- Nice to have: experience with Docker, DBT, data replication tools (SLT, Fivetran, Airbyte, HVR, Qlik, etc.), shell scripting, Linux commands, AWS S3, or Big Data technologies
- Strong project management, problem-solving, and troubleshooting skills with the ability to exercise mature judgment
- Enthusiastic, professional, and confident team-player attitude with a strong focus on customer success, presenting effectively even under adverse conditions
- Strong presentation and communication skills

Next Steps
Our recruitment process is highly personalized. Some candidates complete the hiring process in one week, others may take longer, as it's important we find the right position for you. It's all about timing and can be a journey as we continue to learn about one another. We want to get to know you and encourage you to be selective - after all, deciding to join a company is a big decision!

At Atrium, we believe a diverse workforce allows us to match our growth ambitions and drive inclusion across the business. We are an equal opportunity employer, and all qualified applicants will receive consideration for employment.
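The pipeline-architecture work listed above (Airflow DAGs, DBT model graphs, ELT dependencies) rests on topological ordering of tasks. A toy sketch using Python's standard-library `graphlib` follows; the task names are invented, and in Airflow or DBT these would be operators or models with real configuration.

```python
from graphlib import TopologicalSorter

# Toy dependency graph: each task maps to the set of tasks it depends on.
# Names are hypothetical stand-ins for raw loads, staging models, and a mart.
PIPELINE = {
    "load_raw_orders": set(),
    "load_raw_customers": set(),
    "stg_orders": {"load_raw_orders"},
    "stg_customers": {"load_raw_customers"},
    "mart_sales": {"stg_orders", "stg_customers"},
}

# static_order() yields every task after all of its dependencies,
# which is exactly the execution order an orchestrator needs.
run_order = list(TopologicalSorter(PIPELINE).static_order())
print(run_order)
```

`TopologicalSorter` also raises `CycleError` on circular dependencies, the same validation an orchestrator performs when a DAG is registered.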
Posted 3 days ago
5.0 - 10.0 years
25 - 40 Lacs
Noida, Gurugram, Delhi / NCR
Work from Office
Responsibilities: Design Snowflake data warehouses and ETL processes using Informatica and Python. Collaborate with Salesforce teams on data integration projects.
Posted 3 days ago
5.0 - 10.0 years
25 - 40 Lacs
Pune
Work from Office
Responsibilities: Design Snowflake data warehouses and ETL processes using Informatica. Develop Python scripts for data automation and analysis on Salesforce platform.
Posted 3 days ago
5.0 - 10.0 years
25 - 40 Lacs
Bengaluru
Work from Office
Responsibilities: Design and implement data architecture using Snowflake, Python, Salesforce, and Informatica. Ensure data security and compliance standards are met. Collaborate with cross-functional teams on project delivery.
Posted 3 days ago
12.0 - 17.0 years
12 - 17 Lacs
Pune
Work from Office
Role Overview:
The Technical Architect specializes in traditional ETL tools such as Informatica Intelligent Cloud Services (IICS) and similar technologies. The jobholder designs, implements, and oversees robust ETL solutions to support our organization's data integration and transformation needs.

Responsibilities:
- Design and develop scalable ETL architectures using tools like IICS and other traditional ETL platforms.
- Collaborate with stakeholders to gather requirements and translate them into technical solutions.
- Ensure data quality, integrity, and security throughout the ETL processes.
- Optimize ETL workflows for performance and reliability.
- Provide technical leadership and mentorship to development teams.
- Troubleshoot and resolve complex technical issues related to ETL processes.
- Document architectural designs and decisions for future reference.
- Stay updated with emerging trends and technologies in ETL and data integration.

Key Technical Skills & Responsibilities:
- 12+ years of experience in data integration and ETL development, with at least 3 years in an Informatica architecture role.
- Extensive expertise in Informatica PowerCenter, IICS, and related tools (Data Quality, EDC, MDM).
- Proven track record of designing ETL solutions for enterprise-scale data environments.
- Advanced proficiency in Informatica PowerCenter and IICS for ETL/ELT design and optimization.
- Strong knowledge of SQL, Python, or Java for custom transformations and scripting.
- Experience with data warehousing platforms (Snowflake, Redshift, Azure Synapse) and data lakes.
- Familiarity with cloud platforms (AWS, Azure, GCP) and their integration services.
- Expertise in data modeling, schema design, and integration patterns.
- Knowledge of CI/CD, Git, and infrastructure-as-code (e.g., Terraform).
- Experience working on proposals, customer workshops, assessments, etc. is preferred.
- Good communication and presentation skills.

Primary Skills:
- Informatica, IICS
- Data lineage and metadata management
- Data modeling
- Data governance
- Data integration architectures
- Informatica Data Quality

Eligibility Criteria:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience in ETL architecture and development using tools like IICS.
- Strong understanding of data integration, transformation, and warehousing concepts.
- Proficiency in SQL and scripting languages.
- Experience with cloud-based ETL solutions is a plus.
- Familiarity with Agile development methodologies.
- Excellent problem-solving and analytical skills.
- Strong communication and leadership abilities.
- Knowledge of data governance and compliance standards.
- Ability to work in a fast-paced environment and manage multiple priorities.
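One of the integration patterns the role above calls for, the Type-2 slowly changing dimension, can be sketched in a few lines. This is an illustration only: the table and column names are invented, the store is an in-memory SQLite database, and an Informatica/IICS mapping would implement the same merge logic declaratively.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE dim_customer (
    customer_id INTEGER, city TEXT,
    valid_from TEXT, valid_to TEXT, is_current INTEGER)""")

def scd2_merge(conn, customer_id, city, as_of):
    """Type-2 merge: keep history by closing the old row and adding a new one."""
    cur = conn.execute(
        "SELECT city FROM dim_customer WHERE customer_id=? AND is_current=1",
        (customer_id,)).fetchone()
    if cur and cur[0] == city:
        return  # no change: nothing to do
    if cur:
        # Close out the current version before inserting the new one.
        conn.execute(
            "UPDATE dim_customer SET valid_to=?, is_current=0 "
            "WHERE customer_id=? AND is_current=1", (as_of, customer_id))
    conn.execute(
        "INSERT INTO dim_customer VALUES (?, ?, ?, '9999-12-31', 1)",
        (customer_id, city, as_of))

scd2_merge(conn, 1, "Pune", "2024-01-01")
scd2_merge(conn, 1, "Pune", "2024-02-01")    # unchanged: no new version
scd2_merge(conn, 1, "Mumbai", "2024-03-01")  # changed: old row closed, new row added
rows = conn.execute(
    "SELECT city, is_current FROM dim_customer ORDER BY valid_from").fetchall()
```

After the three merges the dimension holds the full history: the closed-out Pune row and the current Mumbai row.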
Posted 3 days ago
15.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
About The Company: e.l.f. Beauty, Inc. stands with every eye, lip, face and paw. Our deep commitment to clean, cruelty-free beauty at an incredible value has fueled the success of our flagship brand e.l.f. Cosmetics since 2004 and driven our portfolio expansion. Today, our multi-brand portfolio includes e.l.f. Cosmetics, e.l.f. SKIN, pioneering clean beauty brand Well People, Keys Soulcare, a groundbreaking lifestyle beauty brand created with Alicia Keys, and Naturium, high-performance, biocompatible, clinically effective and accessible skincare. In our fiscal year 2024, we had net sales of $1 billion, and our business performance has been nothing short of extraordinary, with 24 consecutive quarters of net sales growth. We are the #2 mass cosmetics brand in the US and are the fastest growing mass cosmetics brand among the top 5. Our total compensation philosophy offers every full-time new hire competitive pay and benefits, bonus eligibility (200% of target over the last four fiscal years), equity, and a hybrid 3-day-in-office, 2-day-at-home work environment. We believe the combination of our unique culture, total compensation, workplace flexibility and care for the team is unmatched across not just beauty but any industry. Visit our Career Page to learn more about our team: https://www.elfbeauty.com/work-with-us

Position Summary: We are seeking a skilled Lead Data Engineer to join our dynamic team. The Lead Data Engineer will be responsible for designing, developing, and maintaining our data pipelines, integrations, and data warehouse infrastructure. The successful candidate will work closely with data scientists, analysts, and business stakeholders to ensure that our data is accurate, secure, and accessible for all users.
Responsibilities: Design and build scalable data pipeline architecture that can handle large volumes of data. Develop ELT/ETL pipelines to extract, load and transform data from various sources into our data warehouse. Optimize and maintain the data infrastructure to ensure high availability and performance. Collaborate with data scientists and analysts to identify and implement improvements to our data pipeline and models. Develop and maintain data models to support business needs. Ensure data security and compliance with data governance policies. Identify and troubleshoot data quality issues. Automate and streamline processes related to data management. Stay up-to-date with emerging data technologies and trends to ensure the continuous improvement of our data infrastructure and architecture. Analyze the data products and requirements to align with data strategy. Assist in extracting or researching data for cross-functional business partners for consumer insights, supply chain, and finance teams. Enhance the efficiency, automation, and accuracy of existing reports. Follow best practices in data querying and manipulation to ensure data integrity.

Requirements: Bachelor's or Master's degree in Computer Science, Data Science, or a related field. Must have 15+ years of experience as a Data Engineer or in a related role. Must have experience with Snowflake. Strong Snowflake experience building, maintaining and documenting data pipelines. Expertise in Snowflake concepts like RBAC management, virtual warehouses, file formats, streams, zero-copy clone, and time travel, and understanding of how to use these features. Strong SQL development experience including SQL queries and stored procedures. Strong knowledge of ELT/ETL no-code/low-code tools like Informatica / SnapLogic. Well versed in data standardization, cleansing, enrichment, and modeling. Proficiency in one or more programming languages such as Python, Java, or C#. Experience with cloud computing platforms such as AWS, Azure, or GCP. Knowledge of ELT/ETL
processes, data warehousing, and data modeling. Familiarity with data security and governance best practices. Excellent hands-on problem-solving and analytical skills, with experience improving the performance of processes. Strong communication and collaboration skills. Minimum Work Experience: 15 years. Maximum Work Experience: 20 years.

This job description is intended to describe the general nature and level of work being performed in this position. It also reflects the general details considered necessary to describe the principal functions of the job identified, and shall not be considered a detailed description of all the work required inherent in the job. It is not an exhaustive list of responsibilities, and it is subject to changes and exceptions at the supervisors' discretion. e.l.f. Beauty respects your privacy. Please see our Job Applicant Privacy Notice (www.elfbeauty.com/us-job-applicant-privacy-notice) for how your personal information is used and shared.
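The Snowflake features called out above (streams, zero-copy clone, time travel) are commonly combined into incremental-load patterns where a stream feeds a MERGE. As a purely illustrative, tool-agnostic sketch (this is not Snowflake API code; the row shape with an `id`, `action`, and `data` payload is an assumption), the upsert semantics a stream-fed MERGE applies can be modeled in plain Python:

```python
def merge_changes(target: dict, changes: list) -> dict:
    """Models MERGE semantics over CDC rows: each change carries a key,
    a payload, and an action (INSERT/UPDATE/DELETE), as a Snowflake stream
    would expose. Pure-Python model for illustration only."""
    result = dict(target)
    for row in changes:
        if row["action"] == "DELETE":
            result.pop(row["id"], None)
        else:  # INSERT or UPDATE both upsert the payload
            result[row["id"]] = row["data"]
    return result

target = {1: {"amount": 100}, 2: {"amount": 250}}
changes = [
    {"id": 2, "action": "UPDATE", "data": {"amount": 300}},
    {"id": 3, "action": "INSERT", "data": {"amount": 75}},
    {"id": 1, "action": "DELETE", "data": None},
]
print(merge_changes(target, changes))  # → {2: {'amount': 300}, 3: {'amount': 75}}
```

In Snowflake itself this would be a `MERGE INTO ... USING stream` statement scheduled by a task; the Python version just spells out the per-row logic.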
Posted 3 days ago
2.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Project Role: Application Developer. Project Role Description: Design, build and configure applications to meet business process and application requirements. Must have skills: Informatica PowerCenter. Good to have skills: NA. Minimum 2 year(s) of experience is required. Educational Qualification: 15 years of full-time education.

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project specifications, developing application features, and ensuring that the applications function seamlessly within the existing infrastructure. You will also engage in testing and troubleshooting to enhance application performance and user experience, while actively participating in team discussions to share insights and solutions.

Roles & Responsibilities: - Expected to perform independently and become an SME. - Active participation/contribution in team discussions is required. - Contribute to providing solutions to work-related problems. - Assist in the documentation of application processes and workflows. - Engage in continuous learning to stay updated with industry trends and technologies.

Professional & Technical Skills: - Must-have skills: Proficiency in Informatica PowerCenter. - Strong understanding of ETL processes and data integration techniques. - Experience with database management systems and SQL. - Familiarity with application development methodologies and best practices. - Ability to troubleshoot and resolve application issues efficiently.

Additional Information: - The candidate should have a minimum of 2 years of experience in Informatica PowerCenter. - This position is based at our Hyderabad office. - 15 years of full-time education is required.
Posted 3 days ago
7.0 - 11.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Skill required: Marketing Operations - Content management. Designation: Digital Content Management Specialist. Qualifications: Any Graduation. Years of Experience: 7 to 11 years.

About Accenture: Accenture is a global professional services company with leading capabilities in digital, cloud and security. Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world’s largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. We embrace the power of change to create value and shared success for our clients, people, shareholders, partners and communities. Visit us at www.accenture.com

What would you do? Help balance increased marketing complexity and diminishing marketing resources. Drive marketing performance with deep functional and technical expertise, while accelerating time-to-market and operating efficiencies at scale through Data and Technology, Next Generation Content Services, Digital Marketing Services & Customer Engagement, and Media Growth Services. The role requires Digital Marketing Ads & Promotion creation/design: organize, categorize, and publish content and information using specific tools and channels, for use by different groups and individuals within the organization.

What are we looking for? Ability to work independently and collaboratively with cross-functional teams. Understands the importance of content quality for digital shelf visibility and conversion. Follows client-specific processes, content SLAs, and governance models. Participates in team reviews and communicates status updates clearly and proactively. Strong attention to detail and accuracy. Good understanding of eCommerce platforms (Amazon, Walmart, Nykaa, Flipkart, etc.) and their content guidelines.
Familiarity with syndication platforms or PIM tools (Salsify, Syndigo, Alkemics, STEP, Informatica, etc.) is a plus. Strong command of MS Excel and ability to manage trackers; MS Access is a plus. Ability to handle multiple tasks and tight deadlines in a high-pressure environment. Clear written and verbal communication skills. Highly organized, self-motivated, and dependable. Open to working from the office in rotational or fixed shifts.

Roles and Responsibilities: Upload, manage, and validate digital product content across eCommerce platforms, retailer portals, or Product Information Management (PIM) systems. Collaborate with creative, copywriting, and brand teams to ensure content alignment and readiness. Conduct quality checks and validations to ensure content accuracy on retailer websites post-publish. Map product attributes based on retailer-specific requirements and support channel-specific formatting. Track and log status updates, content publishing progress, and exceptions using Excel trackers or internal tools. Ensure consistency with brand tone, visual standards, and retailer specifications across categories. Document and update workflow processes, SOPs, and operational guides as needed. Identify content gaps, publishing errors, or workflow inefficiencies and escalate appropriately. Handle high volumes of content tasks with a strong focus on accuracy and turnaround time (TAT). Lead a team of eCommerce specialists and provide SME support.
Posted 4 days ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Wonderful job opportunity for an Informatica Developer. Greetings from LTIMindtree!!!

MDM Developer: Accountable for delivery of the MDM capabilities required by the business functions. Understand specs, user stories, and product planning docs around the MDM implementation. Hands-on experience in Informatica MDM is a must, ideally with Customer 360 implementation experience. Understand the Customer Master roadmap, aligning with business objectives and outcomes. Support the creation and implementation of a data model that helps support all business use cases. Participate in achieving a single source of truth from a master data perspective. Participate in the full lifecycle of complex cross-functional programs/projects with considerable impact across multiple organizations. Participate in the adoption and implementation of best practices across key data elements and processes. Understand business and functional requirements documents. Participate with business, technology, and operations in driving data cleanup and validation efforts across various systems to achieve clean, complete master data. Identification, root cause analysis, prioritization, remediation planning, and coordination and execution of remediation activities for data issues. Update system data documentation, metadata dictionary, and lineage in accordance with established policies. Help resolve critical issues and provide technical resolution. Participate in the development and maintenance of the technologies that automate data processes. Analyze the current ETL process; define and design new systems. Design, develop, test, and deploy ETL/MDM processes. Work with users and team members at all levels on performance improvement and suggestions. Ability to convert business requirements into technical specifications and decide timelines to accomplish MDM development and work with Informatica MDM. https://forms.office.com/r/x06xaTQHf0
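The MDM work described above centers on match-and-merge: consolidating source records into a golden record via survivorship rules. A hedged, minimal Python sketch of one common survivorship strategy, where the most recently updated populated value wins (the field names and the recency rule are illustrative assumptions, not Informatica MDM's actual engine or trust framework):

```python
def build_golden_record(records: list) -> dict:
    """Illustrative survivorship: walk source records oldest-to-newest so a
    later, populated value overwrites an earlier one; empty values never
    overwrite. 'last_updated' is an assumed attribute for the sketch."""
    golden = {}
    for rec in sorted(records, key=lambda r: r["last_updated"]):
        for field, value in rec.items():
            if field != "last_updated" and value not in (None, ""):
                golden[field] = value
    return golden

sources = [
    {"name": "ACME Corp", "email": "", "city": "Pune", "last_updated": "2024-01-10"},
    {"name": "Acme Corporation", "email": "ops@acme.example", "city": None, "last_updated": "2024-06-02"},
]
print(build_golden_record(sources))
# → {'name': 'Acme Corporation', 'email': 'ops@acme.example', 'city': 'Pune'}
```

Real MDM platforms layer trust scores, source rankings, and per-attribute rules on top of this idea; the sketch only shows the core "survivor per attribute" shape.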
Posted 4 days ago
4.0 - 10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Designation: Data Governance Specialist. Location: Pune. Notice Period: Immediate to 30 days. Mandatory Skills: SAP Master Data Governance (MDG), SAP Data Quality (DQ), Alation Data Quality (DQ)/Alation Data Governance (DG)/Alation Data Lineage, Integration/Certification.

JOB RESPONSIBILITIES: The role of Data Governance Manager is a first-level leadership position within the Finance Data Office's Data Management team. This is a pivotal role for setting up and driving the data governance framework and data-related principles and policies to create a culture of data accountability across all finance data domains. The role is responsible for leading and executing the data governance agenda, including data definition, data ownership, data standards, data remediation, and master data governance processes across Finance. The Data Governance Manager is expected to partner with the wider data management team on improvement of data quality by implementing data monitoring solutions. The ideal candidate will have a proven track record of working with data governance platforms such as Alation or Collibra for SAP master data domains. This position will take accountability for defining and driving data governance aspects, including leading meetings and data governance forums with Data Stewards, Data Owners, Data Engineers, and other key stakeholders. Coordinating with Data Owners to enable identification of critical data elements for SAP master data - Supplier/Finance/Bank master.
Develop and maintain a business-facing data glossary and data catalog for SAP master data (Supplier, Customer, Finance (GL, Cost Center, Profit Center, etc.)), capturing data definitions, lineage, and usage for relevant SAP master data. Define Data Governance Framework: Develop and implement data governance policies, standards, and processes to ensure data quality, data management, and compliance for relevant SAP master data (Finance, Supplier, and Customer master data). Conduct data quality assessments and implement corrective actions to address data quality issues. Collaborate with cross-functional teams to ensure data governance practices are integrated into all SAP-relevant business processes. Data Cataloging and Lineage: Manage data cataloging and lineage to provide visibility into data assets, their origins, and transformations in the SAP environment. Facilitate governance forums, data domain councils, and change advisory boards to review data issues, standards, and continuous improvements. Responsible for preparing data documentation, including data models, process flows, governance policies, and stewardship responsibilities. Collaboration: Work closely with IT, data management teams, and business units to implement data governance best practices and tools. Monitoring and Reporting: Monitor data governance activities, measure progress, and report on key metrics to senior management. Training and Awareness: Conduct training sessions and create awareness programs to promote data governance within the organization. Data structures and models: Demonstrate deep understanding of SAP (and other ERP systems such as JD Edwards, etc.) master data structures such as Vendor, Customer, Cost Center, Profit Center, GL Accounts, etc. Data Policies: Collaborate and coordinate with respective pillar leads to ensure necessary policies related to data privacy, data lifecycle management, and data quality management are being developed. JOB REQUIREMENTS: i.
Education or Certifications: Bachelor's/Master's degree in engineering/technology/other related degrees. Relevant professional-level certifications from Informatica, SAP, Collibra, Alation, or any other leading platform/tools. Relevant certifications from DAMA, EDM Council and CMMI-DMM will be a bonus. ii. Work Experience: You have 4-10 years of relevant experience within the Data & Analytics area, with major experience around data management areas: ideally in Data Governance (DG) and/or Data Quality or Master Data or Data Lineage using relevant tools like Informatica or SAP MDG or Collibra or Alation or any other market-leading tools. You have in-depth knowledge of Data Quality and Data Governance concepts, approaches, methodologies and tools. Client-facing consulting experience will be considered a plus.
Posted 4 days ago
7.0 - 12.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Role: Data Quality (DQ) Specialist. Experience: 7-12 years of relevant professional experience in Data Quality and/or Data Governance, Data Management, or Data Lineage solution implementation. Location: Pune. Notice: Immediate/max 15 days. Work Mode: 5 days WFO.

JOB RESPONSIBILITIES: The job entails working with our clients and partners to design, define, implement, roll out, and improve Data Quality solutions that leverage various tools available in the market, for example Informatica IDQ, SAP DQ, SAP MDG, Collibra DQ, Talend DQ, a custom DQ solution, and/or other leading platforms, for the client's business benefit. The ideal candidate will be responsible for ensuring the accuracy, completeness, consistency, and reliability of data across systems. You will work closely with data engineers, analysts, and business stakeholders to define and implement data quality frameworks and tools. As part of your role and responsibilities, you will get the opportunity to be involved in the entire business development lifecycle: meet with business individuals to gather information and analyze existing business processes, determine and document gaps and areas for improvement, prepare requirements documents, functional design documents, etc. To summarize, work with the project stakeholders to identify business needs and gather requirements for the following areas: Data Quality and/or Data Governance or Master Data. Follow up on the implementation by conducting training sessions, and planning and executing the technical and functional transition to the support team. Ability to grasp business and technical concepts and transform them into creative, lean, and smart data management solutions. Development and implementation of a Data Quality solution in any of the above leading platform-based Enterprise Data Management solutions. Assess and improve data quality across multiple systems and domains. Define and implement data quality rules, metrics, and dashboards.
Perform data profiling, cleansing, and validation using industry-standard tools. Collaborate with data stewards and business units to resolve data issues. Develop and maintain data quality documentation and standards. Support data governance initiatives and master data management (MDM). Recommend and implement data quality tools and automation strategies. Conduct root cause analysis of data quality issues and propose remediation plans. Implement and take advantage of AI to improve/automate the Data Quality solution. Leveraging SAP MDG/ECC experience, the candidate is able to deep-dive into root cause analysis for assigned use cases, and is able to work with Azure Data Lake (via Databricks) using SQL/Python. A data model (conceptual and physical) will need to be identified and built that provides an automated mechanism to monitor ongoing DQ issues; multiple workshops may also be needed to work through various options and identify the one that is most efficient and effective. Works with business (Data Owners/Data Stewards) to profile data to expose patterns indicating data quality issues, and is able to identify impact to specific CDEs deemed important for each individual business. Identifies financial impacts of data quality issues, and is able to identify business benefit (quantitative/qualitative) from a remediation standpoint along with managing implementation timelines. Schedules regular working groups with business units that have identified DQ issues and ensures progression toward RCA/remediation or presentation in DGFs. Identifies business DQ rules on the basis of which KPIs/measures are stood up that feed into the dashboarding/workflows for BAU monitoring; red flags are raised and investigated. An understanding of the Data Quality value chain, starting with Critical Data Element concepts, Data Quality issues, and Data Quality KPIs/measures, is needed.
Also has experience owning and executing Data Quality Issue assessments to aid improvements to operational processes and BAU initiatives. Highlights risky/hidden DQ issues to the Lead/Manager for further guidance/escalation. Communication skills are important in this role, as it is outward-facing and the focus has to be on clearly articulating messages. Supports designing, building, and deploying data quality dashboards via Power BI. Determines escalation paths and constructs workflows and alerts which notify process and data owners of unresolved data quality issues. Collaborates with IT & analytics teams to drive innovation (AI, ML, cognitive science, etc.). Works with business functions and projects to create data quality improvement plans. Sets targets for data improvements/maturity; monitors and intervenes when sufficient progress is not being made. Supports initiatives which are driving data clean-up of the existing data landscape.

JOB REQUIREMENTS: i. Education or Certifications: Bachelor's/Master's degree in engineering/technology/other related degrees. Relevant professional-level certifications from Informatica, SAP, Collibra, Talend, or any other leading platform/tools. Relevant certifications from DAMA, EDM Council and CMMI-DMM will be a bonus. ii. Work Experience: You have 7-12 years of relevant experience within the Data & Analytics area, with major experience around data management areas: ideally in Data Quality (DQ) and/or Data Governance or Master Data using relevant tools. You have in-depth knowledge of Data Quality and Data Governance concepts, approaches, methodologies and tools. Client-facing consulting experience will be considered a plus. iii.
Technical and Functional Skills: Hands-on experience in any of the above DQ tools in the area of enterprise data management, preferably in complex and diverse systems environments. Exposure to data quality concepts: data lifecycle, data profiling, data quality remediation (cleansing, parsing, standardization, enrichment using third-party plugins, etc.). Strong understanding of data quality best practices and concepts, data quality management frameworks, and data quality dimensions/KPIs. Deep knowledge of SQL and stored procedures. Should have strong knowledge of Master Data, Data Governance, and Data Security. Domain knowledge of SAP Finance modules is preferred. Hands-on experience with AI use cases in Data Quality or Data Management areas is good to have. Concepts and hands-on experience of master data management (matching, merging, creation of golden records for master data entities) are preferred. Strong soft skills: interpersonal, team, and communication skills (both verbal and written).
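The profiling and KPI work this posting describes typically starts from simple column-level measures. A minimal, tool-agnostic Python sketch of two standard DQ dimensions, completeness and uniqueness (the function and metric names are illustrative conventions, not tied to IDQ or any listed tool):

```python
def profile_column(values: list) -> dict:
    """Illustrative DQ profiling: completeness is the share of non-null,
    non-empty values; uniqueness is the share of distinct values among the
    populated ones. Both are common DQ dimensions, computed here naively."""
    total = len(values)
    populated = [v for v in values if v not in (None, "")]
    completeness = len(populated) / total if total else 0.0
    uniqueness = len(set(populated)) / len(populated) if populated else 0.0
    return {"completeness": round(completeness, 2), "uniqueness": round(uniqueness, 2)}

print(profile_column(["A", "B", "B", None, "C"]))
# → {'completeness': 0.8, 'uniqueness': 0.75}
```

In a real engagement these per-column scores would feed the KPI dashboards and threshold-based red flags the posting mentions, with rules defined per critical data element.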
Posted 4 days ago
4.0 - 10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
The role of Data Governance Manager is a first-level leadership position within the Finance Data Office's Data Management team. This is a pivotal role for setting up and driving the data governance framework and data-related principles and policies to create a culture of data accountability across all finance data domains. The role is responsible for leading and executing the data governance agenda, including data definition, data ownership, data standards, data remediation, and master data governance processes across Finance. The Data Governance Manager is expected to partner with the wider data management team on improvement of data quality by implementing data monitoring solutions. The ideal candidate will have a proven track record of working with data governance platforms such as Alation or Collibra for SAP master data domains. This position will take accountability for defining and driving data governance aspects, including leading meetings and data governance forums with Data Stewards, Data Owners, Data Engineers, and other key stakeholders. Coordinating with Data Owners to enable identification of critical data elements for SAP master data – Supplier/Finance/Bank master. Develop and maintain a business-facing data glossary and data catalog for SAP master data (Supplier, Customer, Finance (GL, Cost Center, Profit Center, etc.)), capturing data definitions, lineage, and usage for relevant SAP master data. Define Data Governance Framework: Develop and implement data governance policies, standards, and processes to ensure data quality, data management, and compliance for relevant SAP master data (Finance, Supplier, and Customer master data). Conduct data quality assessments and implement corrective actions to address data quality issues. Collaborate with cross-functional teams to ensure data governance practices are integrated into all SAP-relevant business processes.
Data Cataloging and Lineage: Manage data cataloging and lineage to provide visibility into data assets, their origins, and transformations in the SAP environment. Facilitate governance forums, data domain councils, and change advisory boards to review data issues, standards, and continuous improvements. Responsible for preparing data documentation, including data models, process flows, governance policies, and stewardship responsibilities. Collaboration: Work closely with IT, data management teams, and business units to implement data governance best practices and tools. Monitoring and Reporting: Monitor data governance activities, measure progress, and report on key metrics to senior management. Training and Awareness: Conduct training sessions and create awareness programs to promote data governance within the organization. Data structures and models: Demonstrate deep understanding of SAP (and other ERP systems such as JD Edwards, etc.) master data structures such as Vendor, Customer, Cost Center, Profit Center, GL Accounts, etc. Data Policies: Collaborate and coordinate with respective pillar leads to ensure necessary policies related to data privacy, data lifecycle management, and data quality management are being developed. JOB REQUIREMENTS: i. Education or Certifications: Bachelor's/Master's degree in engineering/technology/other related degrees. Relevant professional-level certifications from Informatica, SAP, Collibra, Alation, or any other leading platform/tools. Relevant certifications from DAMA, EDM Council and CMMI-DMM will be a bonus. ii. Work Experience: You have 4-10 years of relevant experience within the Data & Analytics area, with major experience around data management areas: ideally in Data Governance (DG) and/or Data Quality or Master Data or Data Lineage using relevant tools like Informatica or SAP MDG or Collibra or Alation or any other market-leading tools.
You have in-depth knowledge of Data Quality and Data Governance concepts, approaches, methodologies and tools. Client-facing consulting experience will be considered a plus. iii. Technical and Functional Skills: Hands-on experience in any of the above tools in the area of Enterprise Data Governance, preferably in SAP or complex and diverse systems environments. Experience implementing data governance in an SAP environment, for both transactional and master data. Expert knowledge of data governance concepts around data definition and catalog, data ownership, data lineage, data policies and controls, data monitoring, and data governance forums. Strong knowledge of SAP peripheral systems and a good understanding of the upstream and downstream impact of master data. Exposure to data quality concepts: data lifecycle, data profiling, data quality remediation (cleansing, parsing, standardization, enrichment using third-party plugins, etc.). Strong understanding of data quality best practices and concepts, data quality management frameworks, and data quality dimensions/KPIs. Deep knowledge of SQL and stored procedures. Should have strong knowledge of Master Data and Data Security. Domain knowledge of SAP Finance modules is preferred. Hands-on experience with AI use cases in Data Quality, Data Governance, or other management areas is good to have. Concepts and hands-on experience of master data management (matching, merging, creation of golden records for master data entities) are preferred. Strong soft skills: interpersonal, team, and communication skills (both verbal and written). Preferred: project management, domain knowledge [Procurement, Finance, Customer], business acumen, critical thinking, storytelling.
Posted 4 days ago
5.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Technical Sales Support Engineer
Location: Bengaluru, Karnataka
Experience: 5+ Years
Education: Bachelor’s degree in Computer Science, Engineering, or a related field

About the Role: As a Technical Sales Support Engineer for the Global Technical Sales Environment, you will be responsible for managing and optimizing the cloud resources that support technical sales engagements. This role involves provisioning, maintaining, and enhancing the infrastructure required for POCs, workshops, and product demonstrations for the technical sales community. Beyond infrastructure management, you will play a critical role in automation: driving efficient deployments, optimizing cloud operations, and developing tools to enhance productivity. Security will be a key focus, requiring proactive identification and mitigation of vulnerabilities to ensure compliance with enterprise security standards. Expertise in automation, scripting, and infrastructure development will be essential to deliver scalable, secure, and high-performance solutions supporting customer, prospect, and partner engagements.

Key Responsibilities:
Cloud Infrastructure Management & Support of TechSales Activities:
- Install, upgrade, configure, and optimize the Informatica platform, both on-premises and in Cloud platform runtime environments.
- Manage the configuration, security, and networking aspects of Informatica Cloud demo platforms and resources.
- Coordinate with Cloud Trust Operations to ensure smooth implementation of Informatica Cloud Platform changes.
- Monitor cloud environments across AWS, Azure, GCP, and Oracle Cloud to detect potential issues and mitigate risks proactively.
- Analyze cloud resource utilization and implement cost-optimization strategies while ensuring performance and reliability.
Security & Compliance:
- Implement security best practices, including threat monitoring, server log audits, and compliance measures.
- Identify and mitigate vulnerabilities to ensure a robust security posture.
Automation & DevOps Implementation:
- Automate deployments and streamline operations using Bash/Python, Ansible, and DevOps methodologies.
- Install, manage, and maintain Docker containers to support scalable environments.
- Collaborate with internal teams to drive automation initiatives that enhance efficiency and reduce manual effort.
Technical Expertise & Troubleshooting:
- Apply strong troubleshooting skills to diagnose and resolve complex issues in Informatica Cloud demo environments, Docker containers, and hyperscalers (AWS, Azure, GCP, OCI).
- Maintain high availability and performance of the Informatica platform and runtime agent.
- Manage user roles, access controls, and permissions within the Informatica Cloud demo platform.
Continuous Learning & Collaboration:
- Stay updated on emerging cloud technologies and automation trends through ongoing professional development.
- Work closely with Informatica support to drive platform improvements and resolve technical challenges.
Scheduling & On-Call Support:
- Provide 24x5 support per business requirements, ensuring seamless operations.

Role Essentials:
Automation & DevOps Expertise:
- Proficiency in Bash/Python scripting.
- Strong understanding of DevOps principles and CI/CD pipelines.
- Hands-on experience with automation tools like Ansible.
Cloud & Infrastructure Management:
- Experience administering cloud data management platforms and related SaaS.
- Proficiency in Unix/Linux/Windows environments.
- Expertise in cloud computing platforms (AWS, Azure, GCP, OCI).
- Hands-on experience with Docker, containers, and Kubernetes.
Database & Storage Management:
- Experience with relational databases (MySQL, Oracle, Snowflake).
- Strong SQL skills for database administration and optimization.
Monitoring & Observability:
- Familiarity with monitoring tools such as Grafana.

Education & Experience:
- BE or equivalent educational background; a combination of relevant education and experience will be considered.
- Minimum 5+ years of relevant professional experience.
This role offers an opportunity to work in a dynamic, cloud-driven, and automation-focused environment, contributing to the seamless execution of technical sales initiatives.
Preferred Skills:
- Experience administering Informatica Cloud (IDMC) and related products.
- Experience with storage solutions like Snowflake, Databricks, Redshift, and Azure Synapse, and improving database performance.
- Hands-on experience with the Informatica Platform (on-premises).
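The cost-optimization responsibility above usually starts with utilization data. As a minimal sketch (instance names, metric samples, and the 20% threshold are all illustrative assumptions, not from any real cloud platform), the idea can be expressed as:

```python
from statistics import mean

# Hypothetical sketch: flag demo VMs whose average CPU utilization falls
# below a threshold, as a starting point for cost-optimization reviews.
def underutilized(instances, threshold=20.0):
    """Return names of instances whose mean CPU% is below threshold."""
    flagged = []
    for name, samples in instances.items():
        if samples and mean(samples) < threshold:
            flagged.append(name)
    return sorted(flagged)

if __name__ == "__main__":
    # Illustrative fleet of demo environments with sampled CPU percentages.
    demo_fleet = {
        "idmc-demo-01": [5.0, 8.2, 3.1],    # mostly idle
        "workshop-02": [55.0, 61.3, 48.9],  # actively used
        "poc-03": [12.4, 9.8, 15.0],        # mostly idle
    }
    print(underutilized(demo_fleet))
```

In practice the samples would come from the provider's monitoring API rather than a hard-coded dict; the point is only the shape of the check.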
Posted 4 days ago
0.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Location: Bengaluru, KA, IN
Company: ExxonMobil
About us
At ExxonMobil, our vision is to lead in energy innovations that advance modern living and a net-zero future. As one of the world’s largest publicly traded energy and chemical companies, we are powered by a unique and diverse workforce fueled by pride in what we do and what we stand for. The success of our Upstream, Product Solutions and Low Carbon Solutions businesses is the result of the talent, curiosity and drive of our people. They bring solutions every day to optimize our strategy in energy, chemicals, lubricants and lower-emissions technologies. We invite you to bring your ideas to ExxonMobil to help create sustainable solutions that improve quality of life and meet society’s evolving needs. Learn more about our What and our Why and how we can work together.
ExxonMobil’s affiliates in India
ExxonMobil’s affiliates have offices in India in Bengaluru, Mumbai and the National Capital Region. ExxonMobil’s affiliates in India supporting the Product Solutions business engage in the marketing, sales and distribution of performance as well as specialty products across the chemicals and lubricants businesses. The India planning teams are also embedded with global business units for business planning and analytics. ExxonMobil’s LNG affiliate in India supporting the upstream business provides consultant services for other ExxonMobil upstream affiliates and conducts LNG market-development activities. The Global Business Center - Technology Center provides a range of technical and business support services for ExxonMobil’s operations around the globe. ExxonMobil strives to make a positive contribution to the communities where we operate, and its affiliates support a range of education, health and community-building programs in India. Read more about our Corporate Responsibility Framework. To know more about ExxonMobil in India, visit ExxonMobil India and the Energy Factor India.
What role you will play in our team
The UDO DPF Data Engineer will lean in and own the work, connect with others, be resourceful, engage in data communities, pursue technical growth, and bring enthusiasm and commitment. They will be an essential part of a data squad, a small group of data experts assigned to a capability, developing domain knowledge and an understanding of the data within business workflows so that every data product is done right and delights the customer.
City: Bengaluru, Karnataka
What you will do
- Perform ETL and ELT operations and administration using modern tools, programming languages, and systems, securely and in accordance with enterprise data standards
- Assemble, model, and transform large, complex data sets that meet functional and non-functional business requirements into a format that can be analyzed
- Automate processing of data from multiple data sources
- Develop, deploy, and version-control code for data consumption and reuse via APIs
- Employ machine learning techniques to create and sustain data structures
- Perform root cause analysis on external and internal processes and data to identify opportunities for improvement and resolve data quality issues
- Lead data-related workshops with stakeholders to capture data requirements and acceptance criteria
- Mentor and coach junior data engineers, providing guidance on best practices in coding, data modeling, and pipeline design
About You
Skills and Qualifications
- Minimum bachelor’s degree in Data Science, Business Intelligence, Statistics, Computer Engineering, or a related field, or the equivalent combination of education, professional training, and work experience; Petroleum, Chemical, Mechanical, or Civil/Construction Engineering is acceptable given appropriate work experience
- 5 or more years performing duties related to data engineering
- Expert proficiency in at least one of these programming languages: Python (required), NoSQL, SQL, R; competent in source code management
- Build processes supporting data transformation, data structures, metadata, dependency, and workload management
- Create data validation methods and data analysis tools
Preferred Qualifications / Experience
- Excellent problem-solving skills and ability to learn through scattered resources
- Automate routine tasks via scripts and code
- Capacity to successfully manage a pipeline of duties with minimal supervision
- Experience supporting and working with cross-functional teams in a dynamic environment
- Modify existing reports, extracts, dashboards, and cubes as necessary
- Commitment to operations integrity and ability to hold self and others accountable for results
- Data governance skills: data quality management, metadata management, data lineage & provenance, master data management (MDM), data cataloging tools; experience with tools like Collibra, Alation, Azure Purview, Informatica, or Google Data Catalog; data classification & tagging
Your benefits
An ExxonMobil career is one designed to last. Our commitment to you runs deep: our employees grow personally and professionally, with benefits built on our core categories of health, security, finance and life.
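The "create data validation methods" qualification above can be sketched in miniature. The field names and rules below are illustrative assumptions, not an actual ExxonMobil schema:

```python
# Hypothetical row-level data validation sketch: each record is checked for
# required fields and simple range rules, returning human-readable problems.
REQUIRED = ("well_id", "reading")

def validate_row(row):
    """Return a list of problems found in one record (empty if clean)."""
    problems = []
    for field in REQUIRED:
        if row.get(field) in (None, ""):
            problems.append(f"missing {field}")
    reading = row.get("reading")
    if isinstance(reading, (int, float)) and reading < 0:
        problems.append("reading must be non-negative")
    return problems

rows = [
    {"well_id": "W-1", "reading": 42.0},
    {"well_id": "", "reading": -3.0},
]
# Keep only rows with problems, keyed by id, for a quality report.
report = {i: validate_row(r) for i, r in enumerate(rows) if validate_row(r)}
print(report)
```

A real pipeline would typically run such checks inside the ETL flow and route failing rows to a quarantine table rather than printing them.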
We offer you:
- Competitive compensation
- Medical plans, maternity leave and benefits, life, accidental death and dismemberment benefits
- Retirement benefits
- Global networking & cross-functional opportunities
- Annual vacations & holidays
- Day care assistance program
- Training and development program
- Tuition assistance program
- Workplace flexibility policy
- Relocation program
- Transportation facility
Please note benefits may change from time to time without notice, subject to applicable laws. The benefits programs are based on the Company’s eligibility guidelines.
Stay connected with us: to learn more about ExxonMobil in India, visit ExxonMobil India and Energy Factor India. Follow us on LinkedIn and Instagram, like us on Facebook, and subscribe to our channel on YouTube.
EEO Statement: ExxonMobil is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, age, national origin or disability status.
Business solicitation and recruiting scams: ExxonMobil does not use recruiting or placement agencies that charge candidates an advance fee of any kind (e.g., placement fees, immigration processing fees, etc.). Follow the LINK to understand more about recruitment scams in the name of ExxonMobil.
Nothing herein is intended to override the corporate separateness of local entities. Working relationships discussed herein do not necessarily represent a reporting connection, but may reflect a functional guidance, stewardship, or service relationship. Exxon Mobil Corporation has numerous affiliates, many with names that include ExxonMobil, Exxon, Esso and Mobil. For convenience and simplicity, those terms and terms like corporation, company, our, we and its are sometimes used as abbreviated references to specific affiliates or affiliate groups. Abbreviated references describing global or regional operational organizations and global or regional business lines are also sometimes used for convenience and simplicity.
Similarly, ExxonMobil has business relationships with thousands of customers, suppliers, governments, and others. For convenience and simplicity, words like venture, joint venture, partnership, co-venturer, and partner are used to indicate business relationships involving common activities and interests, and those words may not indicate precise legal relationships.
Job Segment: Sustainability, Construction, Construction Engineer, CSR, Database, Energy, Engineering, Management, Technology
Posted 4 days ago
3.0 - 6.0 years
0 Lacs
Hyderabad, Telangana
On-site
Company Profile: At CGI, we’re a team of builders. We call our employees members because all who join CGI are building their own company - one that has grown to 72,000 professionals located in 40 countries. Founded in 1976, CGI is a leading IT and business process services firm committed to helping clients succeed. We have the global resources, expertise, stability and dedicated professionals needed to achieve results for our clients - and for our members. Come grow with us. Learn more at www.cgi.com. This is a great opportunity to join a winning team. CGI offers a competitive compensation package with opportunities for growth and professional development. Benefits for full-time, permanent members start on the first day of employment and include a paid time-off program and profit participation and stock purchase plans. We wish to thank all applicants for their interest and effort in applying for this position; however, only candidates selected for interviews will be contacted. No unsolicited agency referrals, please.
Job Title: Senior Software Engineer - Data Analyst
Experience: 3 to 6 Years
Main location: India, Telangana, Hyderabad
Position ID: J0525-1616
Shift: General Shift (5 days WFO for the initial 8 weeks)
Employment Type: Full Time
Your future duties and responsibilities:
- Design, develop, and optimize complex SQL queries for data extraction, transformation, and loading (ETL).
- Work with Teradata databases to perform high-volume data analysis and support enterprise-level reporting needs.
- Understand business and technical requirements to create and manage Source to Target Mapping (STM) documentation.
- Collaborate with business analysts and domain SMEs to map banking-specific data such as transactions, accounts, customers, products, and regulatory data.
- Analyze large data sets to identify trends, data quality issues, and actionable insights.
- Participate in data migration, data lineage, and reconciliation processes.
- Ensure data governance, quality, and security protocols are followed.
- Support testing and validation efforts during system upgrades or new feature implementations.
Required qualifications to be successful in this role:
- Advanced SQL: joins, subqueries, window functions, performance tuning.
- Teradata: query optimization, utilities (e.g., BTEQ, FastLoad, MultiLoad), DDL/DML.
- Experience with ETL tools (e.g., Informatica, Talend, or custom SQL-based ETL pipelines).
- Hands-on experience preparing STM (Source to Target Mapping) documents.
- Familiarity with data modeling and data warehouse concepts (star/snowflake schema).
- Proficiency in Excel and/or BI tools (Power BI, Tableau, etc.) for data visualization and analysis.
Together, as owners, let’s turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect and belonging. Here, you’ll reach your full potential because… You are invited to be an owner from day 1 as we work together to bring our Dream to life. That’s why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company’s strategy and direction. Your work creates value. You’ll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You’ll shape your career by joining a company built to grow and last.
You’ll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons. Come join our team—one of the largest IT and business consulting services firms in the world.
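The "window functions" qualification in this posting is worth a concrete illustration. The sketch below uses SQLite purely for portability (the role itself targets Teradata, which accepts the same ANSI `ROW_NUMBER() OVER (...)` form); the table and data are invented:

```python
import sqlite3

# Illustrative window-function query: largest transaction per account via
# ROW_NUMBER() over a partition. Runs on SQLite >= 3.25 (bundled with
# modern Python); the same query shape works on Teradata.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE txn (account TEXT, amount REAL);
    INSERT INTO txn VALUES
        ('A', 100.0), ('A', 250.0), ('B', 75.0), ('B', 40.0);
""")
rows = conn.execute("""
    SELECT account, amount FROM (
        SELECT account, amount,
               ROW_NUMBER() OVER (
                   PARTITION BY account ORDER BY amount DESC
               ) AS rn
        FROM txn
    ) WHERE rn = 1
    ORDER BY account
""").fetchall()
print(rows)  # one row per account: its largest transaction
conn.close()
```

The subquery-plus-`rn = 1` filter is the standard idiom because window functions cannot appear directly in a `WHERE` clause.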
Posted 4 days ago
10.0 years
0 Lacs
Hyderabad, Telangana
On-site
Job Information
Date Opened: 07/29/2025
Job Type: Full time
Industry: IT Services
City: Hyderabad
State/Province: Telangana
Country: India
Zip/Postal Code: 500081
About DATAECONOMY: We are a fast-growing data & analytics company headquartered in Dublin, with offices in Dublin, OH, and Providence, RI, and an advanced technology center in Hyderabad, India. We are clearly differentiated in the data & analytics space via our suite of solutions, accelerators, frameworks, and thought leadership.
Job Description
Job Title: Lead Cloud Data Engineer / Technical Architect
Experience: 10+ Years
Location: Hyderabad
Job Summary: We are seeking a highly skilled and experienced Cloud Data Engineer with a strong foundation in AWS, data warehousing, and application migration. The ideal candidate will be responsible for designing and maintaining cloud-based data solutions, leading teams, collaborating with clients, and ensuring smooth migration of on-premises applications to the cloud.
Key Responsibilities:
- Engage directly with clients to understand requirements, provide solution design, and drive successful project delivery.
- Lead cloud migration initiatives, specifically moving on-premises applications and databases to AWS cloud platforms.
- Design, develop, and maintain scalable, reliable, and secure data applications in a cloud environment.
- Lead and mentor a team of engineers; oversee task distribution, progress tracking, and issue resolution.
- Develop, optimize, and troubleshoot complex SQL queries and stored procedures.
- Design and implement robust ETL pipelines using tools such as Talend, Informatica, or DataStage.
- Ensure optimal usage and performance of Amazon Redshift and implement performance-tuning strategies.
- Collaborate across teams to implement best practices in cloud architecture and data management.
Requirements
Required Skills and Qualifications:
- Strong hands-on experience with the AWS ecosystem, including services related to storage, compute, and data analytics.
- In-depth knowledge of data warehouse architecture and best practices.
- Proven experience in on-prem to cloud migration projects.
- Expertise in at least one ETL tool: Talend, Informatica, or DataStage.
- Strong command of SQL and stored procedures.
- Practical knowledge and usage of Amazon Redshift.
- Demonstrated experience in leading teams and managing project deliverables.
- Strong understanding of performance tuning for data pipelines and databases.
Good to Have:
- Working knowledge or hands-on experience with Snowflake.
Educational Qualification: Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
Benefits: As per company standards.
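The ETL pipelines this posting describes share one structure regardless of tool. As a minimal sketch (plain lists stand in for the on-premises source and the Redshift target, and the transform rules are invented for illustration):

```python
# Minimal extract-transform-load sketch. In a real migration, extract()
# would read an on-premises database and load() would write to Amazon
# Redshift; in-memory lists stand in for both here.
def extract(source):
    """Pull raw records from the source system."""
    return list(source)

def transform(records):
    """Apply example cleansing rules: drop rows without an id, tidy names."""
    return [
        {"id": r["id"], "name": r["name"].strip().title()}
        for r in records
        if r.get("id") is not None
    ]

def load(records, target):
    """Write cleansed records to the target; return the row count loaded."""
    target.extend(records)
    return len(records)

source = [{"id": 1, "name": "  ada lovelace "}, {"id": None, "name": "x"}]
warehouse = []
loaded = load(transform(extract(source)), warehouse)
print(loaded, warehouse)
```

Tools like Talend, Informatica, and DataStage express the same three stages visually; keeping each stage a pure function, as above, is what makes pipelines testable and tunable.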
Posted 4 days ago
5.0 - 15.0 years
0 Lacs
hyderabad, telangana
On-site
You have extensive IT experience of 10-15 years in implementing BI solutions using Oracle data technologies such as ODI, OCI Data Integration, OIC, ADW, OBIA/FAW, and OAC/Tableau. Additionally, you possess at least 5+ years of Oracle Cloud expertise with Oracle Cloud-based data technologies and services, including OAC, ADW, Object Storage, FAW, and Data Lake on OCI. Your experience includes EBS or Fusion ERP Analytics in at least two end-to-end project implementations, and you have strong industry domain knowledge in Life Sciences, Healthcare, Pharmacy, Banking, and Finance.

In this role, you will work closely with business users, IT teams, and management to understand data requirements and reporting needs. Your responsibilities will include designing data warehouse architecture, including data models, ETL processes, and Oracle Cloud native integration flows. You will be a Subject Matter Expert and onshore coordinator, providing guidance to OAC and ODI teams and collaborating with functional teams on data models and source table configurations. It will be crucial for you to ensure data models align with business requirements and support reporting and analytical needs. You will work with developers, DBAs, and IT staff to implement data warehouse solutions according to the designed architecture. Additionally, you will be responsible for creating and maintaining detailed documentation for architecture, data models, ETL processes, and configurations. Providing end-user training and support for reporting and analysis, as well as assisting with troubleshooting, will also be part of your responsibilities. It is essential to stay updated on the latest data warehousing technologies and continuously seek opportunities for performance and process improvements.

You should have primary skills in Data Warehousing, Data Modeling, SQL, FAW/FDI, data visualization tools like Tableau, Oracle DV, and OAC/OAS, and data integration tools like ODICS/ODI, Informatica, and ADF/SSIS. Experience in BI, Data Warehousing, Data Modeling design, and BI implementations is good to have, as is knowledge of cloud computing platforms and deployment (Oracle Cloud, AWS, GCP, Azure) and a data engineering/ETL skill set in a technology stack such as Azure Databricks or Snowflake.

If you meet the requirements above and are interested in this position, please send your resume to jobs@360extech.com.
Posted 4 days ago